PR: linxiulei: Pass PodSandboxConfig to PullImage method in CRI
Result: FAILURE
Tests: 1 failed / 578 succeeded
Started: 2018-12-07 02:12
Elapsed: 27m51s
Version: v1.14.0-alpha.0.895+72d903cf0d5c7d
Builder: gke-prow-default-pool-3c8994a8-n2hc
Refs: master:1cd6ccb3, 71764:5e2ed11c
pod: 59cab94f-f9c5-11e8-a62b-0a580a6c02d7
infra-commit: d6f7bb8bf
repo: k8s.io/kubernetes
repo-commit: 72d903cf0d5c7db013174728826fe240ae727556
repos: k8s.io/kubernetes: master:1cd6ccb34458def1347ae96b2e8aacb5338f8e1d, 71764:5e2ed11cf7d4bbbbb811f5c6ed492d35cb9561cf
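For context on the change under test: the PR threads the pod's sandbox configuration into the CRI image-pull path. A minimal sketch of the kubelet-side interface after the change, trimmed to one method — the import paths and exact signature are assumptions based on this branch, not quoted from the PR:

    package cri

    import (
    	v1 "k8s.io/api/core/v1"
    	runtimeapi "k8s.io/kubernetes/pkg/kubelet/apis/cri/runtime/v1alpha2"
    )

    // ImageSpec identifies the image to pull (trimmed sketch).
    type ImageSpec struct {
    	Image string
    }

    // ImageService sketch: PullImage now also receives the
    // PodSandboxConfig of the pod that triggered the pull, mirroring a
    // sandbox_config field added to the CRI PullImageRequest message,
    // so runtimes can make per-pod image-pull decisions.
    type ImageService interface {
    	PullImage(image ImageSpec, pullSecrets []v1.Secret, podSandboxConfig *runtimeapi.PodSandboxConfig) (string, error)
    	// ... other image methods elided
    }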

Test Failures


k8s.io/kubernetes/test/integration/auth TestAuthModeAlwaysAllow 3.77s

go test -v k8s.io/kubernetes/test/integration/auth -run TestAuthModeAlwaysAllow$
I1207 02:28:47.515447  117028 services.go:33] Network range for service cluster IPs is unspecified. Defaulting to {10.0.0.0 ffffff00}.
I1207 02:28:47.515492  117028 master.go:272] Node port range unspecified. Defaulting to 30000-32767.
I1207 02:28:47.515506  117028 master.go:228] Using reconciler: 
I1207 02:28:47.517562  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.517611  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.517723  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.517784  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.518335  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.519009  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.519077  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.519196  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.519960  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.520481  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.520502  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.520519  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.520567  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.520649  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.521006  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.521406  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.521463  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.521587  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.521649  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.522066  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.522821  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.522844  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.522875  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.522939  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.530173  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.531054  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.531088  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.531161  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.531493  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.534122  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.534177  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.534220  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.534362  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.534622  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.535456  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.536190  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.536219  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.536270  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.536343  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.536663  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.537078  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.537104  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.537172  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.537520  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.537848  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.538754  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.538784  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.538824  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.538917  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.539682  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.540081  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.540098  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.540150  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.540402  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.540755  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.541225  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.542029  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.542092  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.542174  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.542467  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.542971  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.542996  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.543026  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.543099  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.543422  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.543870  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.543893  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.543926  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.544095  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.544894  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.545630  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.545688  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.546240  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.546511  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.547056  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.547090  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.547108  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.547162  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.547248  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.547764  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.547790  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.547825  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.547950  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.547989  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.552192  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.567405  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.567454  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.567506  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.567643  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.568621  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.568655  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.568738  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.568869  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.569119  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.570667  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.571434  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.571462  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.571501  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.571594  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.572354  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.572387  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.572438  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.572568  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.572906  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.574817  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.581323  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.581362  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.582907  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.583249  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.588279  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.595773  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.595807  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.595868  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.595938  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.596572  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.598065  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.598093  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.598191  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.598696  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.599886  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.599906  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.599942  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.600065  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.600362  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.603328  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.609646  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.609688  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.609769  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.610229  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.612104  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.612213  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.612327  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.612525  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.612959  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.614109  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.615210  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.614900  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.615316  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.615868  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.616915  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.616989  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.617083  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.617389  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.617762  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.618878  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.631368  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.632648  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.632813  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.632897  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.637354  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.637394  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.637444  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.637490  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.637800  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.641001  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.641682  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.642871  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.642986  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.644454  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.645547  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.645809  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.645895  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.646005  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.646266  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.647119  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.647671  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.647774  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.647879  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.648493  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.650189  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.650767  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.650655  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.652276  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.652378  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.653046  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.653548  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.653698  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.653967  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.654303  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.654974  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.655014  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.655065  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.655178  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.655288  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.656109  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.656155  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.656196  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.656270  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.656459  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.658419  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.659237  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.659302  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.659385  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.659507  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.660264  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.660615  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.661230  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.661312  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.661396  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.662734  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.663374  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.663477  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.663567  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.663916  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.665338  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.666457  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.665591  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.666640  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.666890  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.667459  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.671220  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.671300  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.671381  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.671453  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.672568  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.673291  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.673380  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.673486  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.673809  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.676956  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.679573  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.679679  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.679818  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.679981  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.681435  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.683287  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.683467  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.683679  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.681941  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.686108  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.686229  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.686342  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.686586  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.687080  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.688617  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.691292  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.688911  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.703064  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.703350  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.704217  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.718648  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.718764  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.718858  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.718978  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.720819  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.720915  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.720999  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.721174  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.721466  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.722875  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.722953  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.723034  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.723185  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.723464  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.724662  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.724752  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.724828  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.724963  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.725245  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.726557  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.726620  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.726714  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.726879  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.727162  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.730002  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.730028  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.730064  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.730101  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.730364  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.732628  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.732652  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.732688  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.732807  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.733025  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.733852  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.733869  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.733901  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.734002  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.734254  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.734683  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.734901  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.734920  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.734950  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.734998  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.735937  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.735963  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.735995  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.736063  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.736327  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.736948  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.736963  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.736991  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.737062  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.737280  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.737828  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.737844  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.737871  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.737978  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.738182  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.748919  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.749037  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.749151  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.749313  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.749656  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.750359  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.750427  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.750506  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.750657  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.750952  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.751333  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.751671  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:47.751719  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:47.751759  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:47.751830  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:47.754857  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:28:47.761215  117028 genericapiserver.go:334] Skipping API batch/v2alpha1 because it has no resources.
W1207 02:28:47.775616  117028 genericapiserver.go:334] Skipping API rbac.authorization.k8s.io/v1alpha1 because it has no resources.
W1207 02:28:47.776608  117028 genericapiserver.go:334] Skipping API scheduling.k8s.io/v1alpha1 because it has no resources.
W1207 02:28:47.779022  117028 genericapiserver.go:334] Skipping API storage.k8s.io/v1alpha1 because it has no resources.
W1207 02:28:47.793304  117028 genericapiserver.go:334] Skipping API admissionregistration.k8s.io/v1alpha1 because it has no resources.
I1207 02:28:48.515310  117028 clientconn.go:551] parsed scheme: ""
I1207 02:28:48.515355  117028 clientconn.go:557] scheme "" not registered, fallback to default scheme
I1207 02:28:48.515406  117028 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
I1207 02:28:48.515473  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:48.515993  117028 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
I1207 02:28:48.801000  117028 storage_scheduling.go:91] created PriorityClass system-node-critical with value 2000001000
I1207 02:28:48.805837  117028 storage_scheduling.go:91] created PriorityClass system-cluster-critical with value 2000000000
I1207 02:28:48.805871  117028 storage_scheduling.go:100] all system priority classes are created successfully or already exist.
I1207 02:28:48.812230  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/cluster-admin
I1207 02:28:48.815550  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:discovery
I1207 02:28:48.819478  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:basic-user
I1207 02:28:48.823570  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/admin
I1207 02:28:48.827268  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/edit
I1207 02:28:48.831185  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/view
I1207 02:28:48.835216  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-admin
I1207 02:28:48.839916  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-edit
I1207 02:28:48.844687  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aggregate-to-view
I1207 02:28:48.849392  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:heapster
I1207 02:28:48.853891  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node
I1207 02:28:48.858284  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node-problem-detector
I1207 02:28:48.862108  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node-proxier
I1207 02:28:48.865764  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kubelet-api-admin
I1207 02:28:48.869345  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:node-bootstrapper
I1207 02:28:48.872871  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:auth-delegator
I1207 02:28:48.877805  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-aggregator
I1207 02:28:48.881368  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-controller-manager
I1207 02:28:48.885181  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-scheduler
I1207 02:28:48.888742  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:kube-dns
I1207 02:28:48.892427  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:persistent-volume-provisioner
I1207 02:28:48.896312  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:csi-external-attacher
I1207 02:28:48.900001  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:aws-cloud-provider
I1207 02:28:48.903495  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:certificates.k8s.io:certificatesigningrequests:nodeclient
I1207 02:28:48.906586  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:certificates.k8s.io:certificatesigningrequests:selfnodeclient
I1207 02:28:48.909781  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:volume-scheduler
I1207 02:28:48.913322  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:csi-external-provisioner
I1207 02:28:48.916884  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:attachdetach-controller
I1207 02:28:48.920196  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:clusterrole-aggregation-controller
I1207 02:28:48.923752  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:cronjob-controller
I1207 02:28:48.927162  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:daemon-set-controller
I1207 02:28:48.932191  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:deployment-controller
I1207 02:28:48.936504  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:disruption-controller
I1207 02:28:48.941211  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:endpoint-controller
I1207 02:28:48.944806  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:expand-controller
I1207 02:28:48.948449  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:generic-garbage-collector
I1207 02:28:48.952235  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:horizontal-pod-autoscaler
I1207 02:28:48.956572  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:job-controller
I1207 02:28:48.960011  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:namespace-controller
I1207 02:28:48.963069  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:node-controller
I1207 02:28:48.966583  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:persistent-volume-binder
I1207 02:28:48.969800  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:pod-garbage-collector
I1207 02:28:48.973272  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:replicaset-controller
I1207 02:28:48.976628  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:replication-controller
I1207 02:28:48.979808  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:resourcequota-controller
I1207 02:28:48.982835  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:route-controller
I1207 02:28:48.986576  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:service-account-controller
I1207 02:28:48.990629  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:service-controller
I1207 02:28:48.993969  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:statefulset-controller
I1207 02:28:48.997679  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:ttl-controller
I1207 02:28:49.001334  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:certificate-controller
I1207 02:28:49.039188  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:pvc-protection-controller
I1207 02:28:49.079291  117028 storage_rbac.go:187] created clusterrole.rbac.authorization.k8s.io/system:controller:pv-protection-controller
I1207 02:28:49.119452  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/cluster-admin
I1207 02:28:49.158953  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:discovery
I1207 02:28:49.199686  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:basic-user
I1207 02:28:49.239239  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:node-proxier
I1207 02:28:49.289961  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-controller-manager
I1207 02:28:49.319296  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-dns
I1207 02:28:49.359306  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:kube-scheduler
I1207 02:28:49.399478  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:aws-cloud-provider
I1207 02:28:49.439077  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:node
I1207 02:28:49.479187  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:volume-scheduler
I1207 02:28:49.520388  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:attachdetach-controller
I1207 02:28:49.559119  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:clusterrole-aggregation-controller
I1207 02:28:49.599289  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:cronjob-controller
I1207 02:28:49.640067  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:daemon-set-controller
I1207 02:28:49.679259  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:deployment-controller
I1207 02:28:49.719102  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:disruption-controller
I1207 02:28:49.758992  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:endpoint-controller
I1207 02:28:49.799918  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:expand-controller
I1207 02:28:49.838909  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:generic-garbage-collector
I1207 02:28:49.879409  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:horizontal-pod-autoscaler
I1207 02:28:49.918765  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:job-controller
I1207 02:28:49.959598  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:namespace-controller
I1207 02:28:49.999023  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:node-controller
I1207 02:28:50.038885  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:persistent-volume-binder
I1207 02:28:50.079227  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pod-garbage-collector
I1207 02:28:50.119475  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:replicaset-controller
I1207 02:28:50.159287  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:replication-controller
I1207 02:28:50.199297  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:resourcequota-controller
I1207 02:28:50.238892  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:route-controller
I1207 02:28:50.279223  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:service-account-controller
I1207 02:28:50.319556  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:service-controller
I1207 02:28:50.359371  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:statefulset-controller
I1207 02:28:50.399084  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:ttl-controller
I1207 02:28:50.440309  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:certificate-controller
I1207 02:28:50.479270  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pvc-protection-controller
I1207 02:28:50.518937  117028 storage_rbac.go:215] created clusterrolebinding.rbac.authorization.k8s.io/system:controller:pv-protection-controller
I1207 02:28:50.559177  117028 storage_rbac.go:246] created role.rbac.authorization.k8s.io/extension-apiserver-authentication-reader in kube-system
I1207 02:28:50.598891  117028 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-system
I1207 02:28:50.639786  117028 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:cloud-provider in kube-system
I1207 02:28:50.679224  117028 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:token-cleaner in kube-system
I1207 02:28:50.719388  117028 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system::leader-locking-kube-controller-manager in kube-system
I1207 02:28:50.758736  117028 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system::leader-locking-kube-scheduler in kube-system
I1207 02:28:50.798943  117028 storage_rbac.go:246] created role.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-public
I1207 02:28:50.839730  117028 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system::leader-locking-kube-controller-manager in kube-system
I1207 02:28:50.898850  117028 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system::leader-locking-kube-scheduler in kube-system
I1207 02:28:50.918977  117028 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-system
I1207 02:28:50.958898  117028 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:cloud-provider in kube-system
I1207 02:28:50.999445  117028 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:token-cleaner in kube-system
I1207 02:28:51.038858  117028 storage_rbac.go:276] created rolebinding.rbac.authorization.k8s.io/system:controller:bootstrap-signer in kube-public
I1207 02:28:51.284372  117028 controller.go:170] Shutting down kubernetes service endpoint reconciler
				from junit_f5a444384056ebac4f2929ce7b7920ea9733ca19_20181207-022709.xml
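Nearly all of the output above is grpc-go connection setup from the test apiserver's many etcd storage clients: a dial target given as a bare host:port carries no URI scheme, so the client logs `parsed scheme: ""`, falls back to the default (passthrough) resolver, and reports the address through the v1 balancer wrapper (etcd's clientv3 still used the deprecated v1 balancer API at this vintage). A minimal sketch of a dial that produces the scheme-fallback lines — the options shown are illustrative, not the apiserver's actual etcd client configuration:

    package main

    import (
    	"log"

    	"google.golang.org/grpc"
    )

    func main() {
    	// A bare host:port target has no scheme, so grpc-go logs
    	// `parsed scheme: ""` and falls back to the default resolver
    	// before notifying the balancer of the resolved address.
    	conn, err := grpc.Dial("127.0.0.1:2379", grpc.WithInsecure())
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer conn.Close()
    }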



578 passed and 4 skipped tests not shown.

Error lines from build-log.txt

... skipping 10 lines ...
I1207 02:12:23.992] process 213 exited with code 0 after 0.3m
I1207 02:12:23.993] Call:  gcloud config get-value account
I1207 02:12:24.497] process 226 exited with code 0 after 0.0m
I1207 02:12:24.497] Will upload results to gs://kubernetes-jenkins/pr-logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I1207 02:12:24.497] Call:  kubectl get -oyaml pods/59cab94f-f9c5-11e8-a62b-0a580a6c02d7
W1207 02:12:33.061] The connection to the server localhost:8080 was refused - did you specify the right host or port?
E1207 02:12:33.064] Command failed
I1207 02:12:33.064] process 239 exited with code 1 after 0.1m
E1207 02:12:33.064] unable to upload podspecs: Command '['kubectl', 'get', '-oyaml', 'pods/59cab94f-f9c5-11e8-a62b-0a580a6c02d7']' returned non-zero exit status 1
I1207 02:12:33.064] Root: /workspace
I1207 02:12:33.064] cd to /workspace
I1207 02:12:33.065] Checkout: /workspace/k8s.io/kubernetes master:1cd6ccb34458def1347ae96b2e8aacb5338f8e1d,71764:5e2ed11cf7d4bbbbb811f5c6ed492d35cb9561cf to /workspace/k8s.io/kubernetes
I1207 02:12:33.065] Call:  git init k8s.io/kubernetes
... skipping 859 lines ...
W1207 02:21:55.206] I1207 02:21:55.204390   55610 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for horizontalpodautoscalers.autoscaling
W1207 02:21:55.207] I1207 02:21:55.204461   55610 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for podtemplates
W1207 02:21:55.207] I1207 02:21:55.204522   55610 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for statefulsets.apps
W1207 02:21:55.207] I1207 02:21:55.204557   55610 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for controllerrevisions.apps
W1207 02:21:55.208] I1207 02:21:55.204596   55610 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for endpoints
W1207 02:21:55.208] I1207 02:21:55.204626   55610 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for replicasets.extensions
W1207 02:21:55.208] E1207 02:21:55.204671   55610 resource_quota_controller.go:171] initial monitor sync has error: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W1207 02:21:55.208] I1207 02:21:55.204746   55610 controllermanager.go:516] Started "resourcequota"
W1207 02:21:55.209] I1207 02:21:55.204785   55610 resource_quota_controller.go:276] Starting resource quota controller
W1207 02:21:55.209] I1207 02:21:55.204968   55610 controller_utils.go:1027] Waiting for caches to sync for resource quota controller
W1207 02:21:55.209] I1207 02:21:55.205029   55610 resource_quota_monitor.go:301] QuotaMonitor running
W1207 02:21:55.209] I1207 02:21:55.205382   55610 controllermanager.go:516] Started "pv-protection"
W1207 02:21:55.209] I1207 02:21:55.205505   55610 pv_protection_controller.go:81] Starting PV protection controller
... skipping 12 lines ...
W1207 02:21:55.212] I1207 02:21:55.207887   55610 gc_controller.go:76] Starting GC controller
W1207 02:21:55.213] I1207 02:21:55.207906   55610 controller_utils.go:1027] Waiting for caches to sync for GC controller
W1207 02:21:55.213] I1207 02:21:55.208076   55610 controllermanager.go:516] Started "csrcleaner"
W1207 02:21:55.213] W1207 02:21:55.208093   55610 controllermanager.go:495] "bootstrapsigner" is disabled
W1207 02:21:55.213] W1207 02:21:55.208101   55610 controllermanager.go:508] Skipping "nodeipam"
W1207 02:21:55.213] I1207 02:21:55.208334   55610 cleaner.go:81] Starting CSR cleaner controller
W1207 02:21:55.214] E1207 02:21:55.208679   55610 core.go:76] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W1207 02:21:55.214] W1207 02:21:55.208758   55610 controllermanager.go:508] Skipping "service"
W1207 02:21:55.214] I1207 02:21:55.209287   55610 controllermanager.go:516] Started "serviceaccount"
W1207 02:21:55.214] I1207 02:21:55.209492   55610 serviceaccounts_controller.go:115] Starting service account controller
W1207 02:21:55.215] I1207 02:21:55.209531   55610 controller_utils.go:1027] Waiting for caches to sync for service account controller
W1207 02:21:55.215] I1207 02:21:55.209735   55610 controllermanager.go:516] Started "cronjob"
W1207 02:21:55.215] W1207 02:21:55.209766   55610 controllermanager.go:508] Skipping "ttl-after-finished"
... skipping 5 lines ...
W1207 02:21:55.216] I1207 02:21:55.210898   55610 pvc_protection_controller.go:99] Starting PVC protection controller
W1207 02:21:55.216] I1207 02:21:55.211248   55610 controller_utils.go:1027] Waiting for caches to sync for PVC protection controller
W1207 02:21:55.217] I1207 02:21:55.211828   55610 controllermanager.go:516] Started "horizontalpodautoscaling"
W1207 02:21:55.217] W1207 02:21:55.211854   55610 controllermanager.go:495] "tokencleaner" is disabled
W1207 02:21:55.217] I1207 02:21:55.212022   55610 horizontal.go:156] Starting HPA controller
W1207 02:21:55.217] I1207 02:21:55.212075   55610 controller_utils.go:1027] Waiting for caches to sync for HPA controller
W1207 02:21:55.217] W1207 02:21:55.212178   55610 garbagecollector.go:649] failed to discover preferred resources: the cache has not been filled yet
W1207 02:21:55.218] I1207 02:21:55.213040   55610 garbagecollector.go:133] Starting garbage collector controller
W1207 02:21:55.218] I1207 02:21:55.213073   55610 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1207 02:21:55.218] I1207 02:21:55.213052   55610 controllermanager.go:516] Started "garbagecollector"
W1207 02:21:55.218] I1207 02:21:55.213092   55610 graph_builder.go:308] GraphBuilder running
W1207 02:21:55.218] I1207 02:21:55.213805   55610 controllermanager.go:516] Started "job"
W1207 02:21:55.218] I1207 02:21:55.214108   55610 job_controller.go:143] Starting job controller
... skipping 21 lines ...
W1207 02:21:55.308] I1207 02:21:55.308168   55610 controller_utils.go:1034] Caches are synced for GC controller
W1207 02:21:55.310] I1207 02:21:55.309785   55610 controller_utils.go:1034] Caches are synced for service account controller
W1207 02:21:55.312] I1207 02:21:55.311660   55610 controller_utils.go:1034] Caches are synced for PVC protection controller
W1207 02:21:55.312] I1207 02:21:55.312292   55610 controller_utils.go:1034] Caches are synced for HPA controller
W1207 02:21:55.313] I1207 02:21:55.313213   52262 controller.go:608] quota admission added evaluator for: serviceaccounts
W1207 02:21:55.314] I1207 02:21:55.314316   55610 controller_utils.go:1034] Caches are synced for job controller
W1207 02:21:55.319] E1207 02:21:55.318487   55610 clusterroleaggregation_controller.go:180] view failed with: Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
W1207 02:21:55.320] E1207 02:21:55.318563   55610 clusterroleaggregation_controller.go:180] admin failed with: Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "admin": the object has been modified; please apply your changes to the latest version and try again
W1207 02:21:55.320] I1207 02:21:55.319346   55610 controller_utils.go:1034] Caches are synced for taint controller
W1207 02:21:55.320] I1207 02:21:55.319527   55610 taint_manager.go:198] Starting NoExecuteTaintManager
W1207 02:21:55.387] I1207 02:21:55.387067   55610 controller_utils.go:1034] Caches are synced for stateful set controller
W1207 02:21:55.490] I1207 02:21:55.489562   55610 controller_utils.go:1034] Caches are synced for persistent volume controller
W1207 02:21:55.491] I1207 02:21:55.490792   55610 controller_utils.go:1034] Caches are synced for attach detach controller
W1207 02:21:55.506] I1207 02:21:55.506041   55610 controller_utils.go:1034] Caches are synced for PV protection controller
... skipping 2 lines ...
W1207 02:21:55.620] I1207 02:21:55.619657   55610 controller_utils.go:1034] Caches are synced for daemon sets controller
I1207 02:21:55.721] +++ [1207 02:21:55] On try 3, controller-manager: ok
I1207 02:21:55.770] node/127.0.0.1 created
I1207 02:21:55.784] +++ [1207 02:21:55] Checking kubectl version
I1207 02:21:55.866] Client Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0-alpha.0.895+72d903cf0d5c7d", GitCommit:"72d903cf0d5c7db013174728826fe240ae727556", GitTreeState:"clean", BuildDate:"2018-12-07T02:19:56Z", GoVersion:"go1.11.1", Compiler:"gc", Platform:"linux/amd64"}
I1207 02:21:55.866] Server Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0-alpha.0.895+72d903cf0d5c7d", GitCommit:"72d903cf0d5c7db013174728826fe240ae727556", GitTreeState:"clean", BuildDate:"2018-12-07T02:20:16Z", GoVersion:"go1.11.1", Compiler:"gc", Platform:"linux/amd64"}
W1207 02:21:55.967] W1207 02:21:55.771623   55610 actual_state_of_world.go:491] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W1207 02:21:56.235] The Service "kubernetes" is invalid: spec.clusterIP: Invalid value: "10.0.0.1": provided IP is already allocated
I1207 02:21:56.335] NAME         TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
I1207 02:21:56.336] kubernetes   ClusterIP   10.0.0.1     <none>        443/TCP   35s
I1207 02:21:56.338] Recording: run_kubectl_version_tests
I1207 02:21:56.338] Running command: run_kubectl_version_tests
I1207 02:21:56.361] 
... skipping 10 lines ...
I1207 02:21:56.455]   "buildDate": "2018-12-07T02:20:16Z",
I1207 02:21:56.455]   "goVersion": "go1.11.1",
I1207 02:21:56.455]   "compiler": "gc",
I1207 02:21:56.455]   "platform": "linux/amd64"
I1207 02:21:56.632] }
I1207 02:21:56.632] +++ [1207 02:21:56] Testing kubectl version: check client only output matches expected output
W1207 02:21:56.733] I1207 02:21:56.670240   55610 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1207 02:21:56.734] E1207 02:21:56.695275   55610 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
W1207 02:21:56.734] I1207 02:21:56.713343   55610 controller_utils.go:1034] Caches are synced for garbage collector controller
W1207 02:21:56.734] I1207 02:21:56.713398   55610 garbagecollector.go:142] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
W1207 02:21:56.771] I1207 02:21:56.770743   55610 controller_utils.go:1034] Caches are synced for garbage collector controller
I1207 02:21:56.872] Successful: the flag '--client' shows correct client info
I1207 02:21:56.872] Successful: the flag '--client' correctly has no server version info
I1207 02:21:56.872] +++ [1207 02:21:56] Testing kubectl version: verify json output
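Note: the checks above exercise client-only version output. A minimal reproduction (sketch; no API server connection is needed for the client half):

  # Print only the client version; the server is never contacted.
  kubectl version --client
  # Same information as JSON, which the next test verifies field by field.
  kubectl version --client -o json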
... skipping 57 lines ...
I1207 02:22:00.411] namespace/namespace-1544149320-22876 created
I1207 02:22:00.495] Context "test" modified.
I1207 02:22:00.503] +++ [1207 02:22:00] Testing RESTMapper
W1207 02:22:00.603] I1207 02:22:00.319917   55610 node_lifecycle_controller.go:1222] Initializing eviction metric for zone: 
W1207 02:22:00.604] I1207 02:22:00.320013   55610 node_lifecycle_controller.go:1072] Controller detected that all Nodes are not-Ready. Entering master disruption mode.
W1207 02:22:00.604] I1207 02:22:00.320396   55610 event.go:221] Event(v1.ObjectReference{Kind:"Node", Namespace:"", Name:"127.0.0.1", UID:"ddf8300d-f9c6-11e8-b772-0242ac110002", APIVersion:"", ResourceVersion:"", FieldPath:""}): type: 'Normal' reason: 'RegisteredNode' Node 127.0.0.1 event: Registered Node 127.0.0.1 in Controller
I1207 02:22:00.705] +++ [1207 02:22:00] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
I1207 02:22:00.705] +++ exit code: 0
I1207 02:22:00.821] NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
I1207 02:22:00.822] bindings                                                                      true         Binding
I1207 02:22:00.822] componentstatuses                 cs                                          false        ComponentStatus
I1207 02:22:00.823] configmaps                        cm                                          true         ConfigMap
I1207 02:22:00.823] endpoints                         ep                                          true         Endpoints
... skipping 606 lines ...
I1207 02:22:21.881] poddisruptionbudget.policy/test-pdb-3 created
I1207 02:22:21.985] core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
I1207 02:22:22.059] poddisruptionbudget.policy/test-pdb-4 created
I1207 02:22:22.159] core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
I1207 02:22:22.326] core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:22.504] pod/env-test-pod created
W1207 02:22:22.605] error: resource(s) were provided, but no name, label selector, or --all flag specified
W1207 02:22:22.605] error: setting 'all' parameter but found a non-empty selector.
W1207 02:22:22.605] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1207 02:22:22.606] I1207 02:22:21.527835   52262 controller.go:608] quota admission added evaluator for: poddisruptionbudgets.policy
W1207 02:22:22.606] error: min-available and max-unavailable cannot both be specified
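Note: the last error above is kubectl enforcing that a PodDisruptionBudget takes either --min-available or --max-unavailable, never both. A sketch (the selector value is illustrative):

  kubectl create poddisruptionbudget test-pdb-4 --namespace=test-kubectl-describe-pod --selector=app=rails --max-unavailable=50%   # accepted
  kubectl create poddisruptionbudget bad-pdb --selector=app=rails --min-available=1 --max-unavailable=2                            # rejected: mutually exclusive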
I1207 02:22:22.718] core.sh:264: Successful describe pods --namespace=test-kubectl-describe-pod env-test-pod:
I1207 02:22:22.718] Name:               env-test-pod
I1207 02:22:22.718] Namespace:          test-kubectl-describe-pod
I1207 02:22:22.719] Priority:           0
I1207 02:22:22.719] PriorityClassName:  <none>
I1207 02:22:22.719] Node:               <none>
... skipping 145 lines ...
I1207 02:22:35.146] service "modified" deleted
I1207 02:22:35.242] replicationcontroller "modified" deleted
I1207 02:22:35.513] core.sh:434: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:35.668] pod/valid-pod created
I1207 02:22:35.772] core.sh:438: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1207 02:22:35.939] Successful
I1207 02:22:35.939] message:Error from server: cannot restore map from string
I1207 02:22:35.939] has:cannot restore map from string
I1207 02:22:36.038] Successful
I1207 02:22:36.038] message:pod/valid-pod patched (no change)
I1207 02:22:36.039] has:patched (no change)
W1207 02:22:36.139] E1207 02:22:35.931068   52262 status.go:64] apiserver received an error that is not a metav1.Status: &errors.errorString{s:"cannot restore map from string"}
I1207 02:22:36.240] pod/valid-pod patched
I1207 02:22:36.248] core.sh:455: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I1207 02:22:36.351] core.sh:457: Successful get pods {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubernetes.io/change-cause:kubectl patch pod valid-pod --server=http://127.0.0.1:8080 --match-server-version=true --record=true --patch={"spec":{"containers":[{"name": "kubernetes-serve-hostname", "image": "nginx"}]}}]:
I1207 02:22:36.443] pod/valid-pod patched
I1207 02:22:36.561] core.sh:461: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx2:
I1207 02:22:36.657] pod/valid-pod patched
I1207 02:22:36.768] core.sh:465: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I1207 02:22:36.860] pod/valid-pod patched
I1207 02:22:36.964] core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
I1207 02:22:37.049] pod/valid-pod patched
I1207 02:22:37.160] core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
I1207 02:22:37.337] pod/valid-pod patched
I1207 02:22:37.440] core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I1207 02:22:37.628] +++ [1207 02:22:37] "kubectl patch with resourceVersion 490" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
I1207 02:22:37.894] pod "valid-pod" deleted
I1207 02:22:37.909] pod/valid-pod replaced
I1207 02:22:38.016] core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
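Note: core.sh:491 above demonstrates optimistic concurrency: a patch that embeds a stale metadata.resourceVersion is rejected with a Conflict. A minimal sketch of both cases:

  # Ordinary strategic-merge patch; --record stores the change-cause annotation seen in core.sh:457.
  kubectl patch pod valid-pod --record -p '{"spec":{"containers":[{"name":"kubernetes-serve-hostname","image":"nginx"}]}}'
  # Embedding an outdated resourceVersion in the patch body reproduces the Conflict error.
  kubectl patch pod valid-pod -p '{"metadata":{"resourceVersion":"490"},"spec":{"containers":[{"name":"kubernetes-serve-hostname","image":"nginx"}]}}'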
I1207 02:22:38.168] Successful
I1207 02:22:38.168] message:error: --grace-period must have --force specified
I1207 02:22:38.169] has:\-\-grace-period must have \-\-force specified
I1207 02:22:38.330] Successful
I1207 02:22:38.331] message:error: --timeout must have --force specified
I1207 02:22:38.331] has:\-\-timeout must have \-\-force specified
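Note: both errors above come from kubectl replace, where --grace-period and --timeout only apply to the delete-and-recreate path and therefore require --force. A sketch (manifest path hypothetical):

  kubectl replace --force --grace-period=0 -f pod.yaml   # delete immediately, then recreate
  kubectl replace --grace-period=0 -f pod.yaml           # rejected: --grace-period must have --force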
W1207 02:22:38.492] W1207 02:22:38.492288   55610 actual_state_of_world.go:491] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
I1207 02:22:38.593] node/node-v1-test created
I1207 02:22:38.656] node/node-v1-test replaced
I1207 02:22:38.759] core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
I1207 02:22:38.847] node "node-v1-test" deleted
I1207 02:22:38.958] core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
I1207 02:22:39.260] core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
... skipping 20 lines ...
I1207 02:22:41.117] core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1207 02:22:41.224] pod "valid-pod" force deleted
W1207 02:22:41.325] Edit cancelled, no changes made.
W1207 02:22:41.325] Edit cancelled, no changes made.
W1207 02:22:41.325] Edit cancelled, no changes made.
W1207 02:22:41.325] Edit cancelled, no changes made.
W1207 02:22:41.325] error: 'name' already has a value (valid-pod), and --overwrite is false
W1207 02:22:41.326] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
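Note: the "'name' already has a value" error above is kubectl label refusing to change an existing label without --overwrite. A minimal sketch:

  kubectl label pod valid-pod name=new-value               # rejected: label already set
  kubectl label pod valid-pod name=new-value --overwrite   # accepted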
I1207 02:22:41.426] core.sh:605: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:41.427] +++ [1207 02:22:41] Creating namespace namespace-1544149361-6335
I1207 02:22:41.427] namespace/namespace-1544149361-6335 created
I1207 02:22:41.496] Context "test" modified.
I1207 02:22:41.594] core.sh:610: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 79 lines ...
I1207 02:22:48.657] +++ Running case: test-cmd.run_kubectl_create_error_tests 
I1207 02:22:48.659] +++ working dir: /go/src/k8s.io/kubernetes
I1207 02:22:48.662] +++ command: run_kubectl_create_error_tests
I1207 02:22:48.677] +++ [1207 02:22:48] Creating namespace namespace-1544149368-16843
I1207 02:22:48.757] namespace/namespace-1544149368-16843 created
I1207 02:22:48.833] Context "test" modified.
I1207 02:22:48.841] +++ [1207 02:22:48] Testing kubectl create with error
W1207 02:22:48.942] Error: required flag(s) "filename" not set
W1207 02:22:48.942] 
W1207 02:22:48.942] 
W1207 02:22:48.942] Examples:
W1207 02:22:48.942]   # Create a pod using the data in pod.json.
W1207 02:22:48.943]   kubectl create -f ./pod.json
W1207 02:22:48.943]   
... skipping 38 lines ...
W1207 02:22:48.947]   kubectl create -f FILENAME [options]
W1207 02:22:48.948] 
W1207 02:22:48.948] Use "kubectl <command> --help" for more information about a given command.
W1207 02:22:48.948] Use "kubectl options" for a list of global command-line options (applies to all commands).
W1207 02:22:48.948] 
W1207 02:22:48.948] required flag(s) "filename" not set
I1207 02:22:49.084] +++ [1207 02:22:49] "kubectl create with empty string list" returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
W1207 02:22:49.185] kubectl convert is DEPRECATED and will be removed in a future version.
W1207 02:22:49.185] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I1207 02:22:49.286] +++ exit code: 0
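Note: the validation failure above can be bypassed exactly as the message suggests, although the API server may still reject the object on its own. A sketch using the fixture from the log:

  # Client-side schema validation rejects the empty args entry:
  kubectl create -f hack/testdata/invalid-rc-with-empty-args.yaml
  # --validate=false skips the client-side check and defers to the server:
  kubectl create -f hack/testdata/invalid-rc-with-empty-args.yaml --validate=false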
I1207 02:22:49.321] Recording: run_kubectl_apply_tests
I1207 02:22:49.321] Running command: run_kubectl_apply_tests
I1207 02:22:49.345] 
... skipping 21 lines ...
W1207 02:22:51.639] I1207 02:22:51.066301   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149369-24506", Name:"test-deployment-retainkeys", UID:"fe818761-f9c6-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"498", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-deployment-retainkeys-7495cff5f to 1
W1207 02:22:51.639] I1207 02:22:51.069920   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149369-24506", Name:"test-deployment-retainkeys-7495cff5f", UID:"feedb12e-f9c6-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"502", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-deployment-retainkeys-7495cff5f-4b5jx
I1207 02:22:51.740] apply.sh:67: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:51.816] pod/selector-test-pod created
I1207 02:22:51.927] apply.sh:71: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I1207 02:22:52.035] Successful
I1207 02:22:52.036] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I1207 02:22:52.036] has:pods "selector-test-pod-dont-apply" not found
I1207 02:22:52.125] pod "selector-test-pod" deleted
I1207 02:22:52.235] apply.sh:80: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:52.453] pod/test-pod created (server dry run)
I1207 02:22:52.548] apply.sh:85: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:52.708] pod/test-pod created
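Note: "created (server dry run)" above means the request ran through the server's full admission and validation chain without being persisted. In this kubectl release the flag was spelled --server-dry-run (later releases renamed it --dry-run=server); a sketch with a hypothetical manifest:

  kubectl apply --server-dry-run -f pod.yaml   # nothing is written to storage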
... skipping 4 lines ...
W1207 02:22:53.508] I1207 02:22:53.508268   52262 clientconn.go:551] parsed scheme: ""
W1207 02:22:53.509] I1207 02:22:53.508305   52262 clientconn.go:557] scheme "" not registered, fallback to default scheme
W1207 02:22:53.509] I1207 02:22:53.508353   52262 resolver_conn_wrapper.go:116] ccResolverWrapper: sending new addresses to cc: [{127.0.0.1:2379 0  <nil>}]
W1207 02:22:53.509] I1207 02:22:53.508404   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:22:53.510] I1207 02:22:53.509192   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:22:53.583] I1207 02:22:53.582896   52262 controller.go:608] quota admission added evaluator for: resources.mygroup.example.com
W1207 02:22:53.671] Error from server (NotFound): resources.mygroup.example.com "myobj" not found
I1207 02:22:53.772] kind.mygroup.example.com/myobj created (server dry run)
I1207 02:22:53.773] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I1207 02:22:53.855] apply.sh:129: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:54.017] pod/a created
I1207 02:22:55.518] apply.sh:134: Successful get pods a {{.metadata.name}}: a
I1207 02:22:55.602] Successful
I1207 02:22:55.602] message:Error from server (NotFound): pods "b" not found
I1207 02:22:55.602] has:pods "b" not found
I1207 02:22:55.761] pod/b created
I1207 02:22:55.776] pod/a pruned
I1207 02:22:57.473] apply.sh:142: Successful get pods b {{.metadata.name}}: b
I1207 02:22:57.560] Successful
I1207 02:22:57.560] message:Error from server (NotFound): pods "a" not found
I1207 02:22:57.561] has:pods "a" not found
I1207 02:22:57.638] pod "b" deleted
I1207 02:22:57.732] apply.sh:152: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:22:57.881] pod/a created
I1207 02:22:57.976] apply.sh:157: Successful get pods a {{.metadata.name}}: a
I1207 02:22:58.072] Successful
I1207 02:22:58.073] message:Error from server (NotFound): pods "b" not found
I1207 02:22:58.073] has:pods "b" not found
I1207 02:22:58.234] pod/b created
I1207 02:22:58.332] apply.sh:165: Successful get pods a {{.metadata.name}}: a
I1207 02:22:58.431] apply.sh:166: Successful get pods b {{.metadata.name}}: b
I1207 02:22:58.516] pod "a" deleted
I1207 02:22:58.522] pod "b" deleted
I1207 02:22:58.680] Successful
I1207 02:22:58.680] message:error: all resources selected for prune without explicitly passing --all. To prune all resources, pass the --all flag. If you did not mean to prune all resources, specify a label selector
I1207 02:22:58.680] has:all resources selected for prune without explicitly passing --all
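Note: the error above shows that kubectl apply --prune refuses to run unscoped; it needs either a label selector or an explicit --all. A sketch (directory name hypothetical):

  kubectl apply --prune -l name=a -f manifests/   # prune only objects matching the selector
  kubectl apply --prune --all -f manifests/       # explicitly prune everything previously applied
  kubectl apply --prune -f manifests/             # rejected, as logged above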
I1207 02:22:58.833] pod/a created
I1207 02:22:58.842] pod/b created
I1207 02:22:58.852] service/prune-svc created
I1207 02:23:00.360] apply.sh:178: Successful get pods a {{.metadata.name}}: a
I1207 02:23:00.447] apply.sh:179: Successful get pods b {{.metadata.name}}: b
... skipping 138 lines ...
I1207 02:23:13.614] Context "test" modified.
I1207 02:23:13.622] +++ [1207 02:23:13] Testing kubectl create filter
I1207 02:23:13.733] create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:23:13.960] pod/selector-test-pod created
I1207 02:23:14.102] create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
I1207 02:23:14.208] Successful
I1207 02:23:14.209] message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
I1207 02:23:14.209] has:pods "selector-test-pod-dont-apply" not found
I1207 02:23:14.324] pod "selector-test-pod" deleted
I1207 02:23:14.343] +++ exit code: 0
I1207 02:23:14.380] Recording: run_kubectl_apply_deployments_tests
I1207 02:23:14.380] Running command: run_kubectl_apply_deployments_tests
I1207 02:23:14.400] 
... skipping 37 lines ...
W1207 02:23:17.761] I1207 02:23:17.663104   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149394-429", Name:"nginx", UID:"0ec7485f-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"697", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-5d56d6b95f to 3
W1207 02:23:17.762] I1207 02:23:17.668348   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-5d56d6b95f", UID:"0ec7f226-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"698", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5d56d6b95f-fjtj4
W1207 02:23:17.763] I1207 02:23:17.671988   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-5d56d6b95f", UID:"0ec7f226-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"698", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5d56d6b95f-f2zj8
W1207 02:23:17.763] I1207 02:23:17.673226   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-5d56d6b95f", UID:"0ec7f226-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"698", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-5d56d6b95f-x2w7g
I1207 02:23:17.864] apps.sh:147: Successful get deployment nginx {{.metadata.name}}: nginx
I1207 02:23:22.082] Successful
I1207 02:23:22.082] message:Error from server (Conflict): error when applying patch:
I1207 02:23:22.083] {"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1544149394-429\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
I1207 02:23:22.083] to:
I1207 02:23:22.083] Resource: "extensions/v1beta1, Resource=deployments", GroupVersionKind: "extensions/v1beta1, Kind=Deployment"
I1207 02:23:22.083] Name: "nginx", Namespace: "namespace-1544149394-429"
I1207 02:23:22.085] Object: &{map["status":map["observedGeneration":'\x01' "replicas":'\x03' "updatedReplicas":'\x03' "unavailableReplicas":'\x03' "conditions":[map["message":"Deployment does not have minimum availability." "type":"Available" "status":"False" "lastUpdateTime":"2018-12-07T02:23:17Z" "lastTransitionTime":"2018-12-07T02:23:17Z" "reason":"MinimumReplicasUnavailable"]]] "kind":"Deployment" "apiVersion":"extensions/v1beta1" "metadata":map["namespace":"namespace-1544149394-429" "uid":"0ec7485f-f9c7-11e8-b772-0242ac110002" "resourceVersion":"710" "generation":'\x01' "creationTimestamp":"2018-12-07T02:23:17Z" "name":"nginx" "selfLink":"/apis/extensions/v1beta1/namespaces/namespace-1544149394-429/deployments/nginx" "labels":map["name":"nginx"] "annotations":map["deployment.kubernetes.io/revision":"1" "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1544149394-429\"},\"spec\":{\"replicas\":3,\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx1\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"]] "spec":map["progressDeadlineSeconds":%!q(int64=+2147483647) "replicas":'\x03' "selector":map["matchLabels":map["name":"nginx1"]] "template":map["metadata":map["creationTimestamp":<nil> "labels":map["name":"nginx1"]] "spec":map["containers":[map["terminationMessagePolicy":"File" "imagePullPolicy":"IfNotPresent" "name":"nginx" "image":"k8s.gcr.io/nginx:test-cmd" "ports":[map["containerPort":'P' "protocol":"TCP"]] "resources":map[] "terminationMessagePath":"/dev/termination-log"]] "restartPolicy":"Always" "terminationGracePeriodSeconds":'\x1e' "dnsPolicy":"ClusterFirst" "securityContext":map[] "schedulerName":"default-scheduler"]] "strategy":map["type":"RollingUpdate" "rollingUpdate":map["maxUnavailable":'\x01' "maxSurge":'\x01']] "revisionHistoryLimit":%!q(int64=+2147483647)]]}
I1207 02:23:22.085] for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.extensions "nginx": the object has been modified; please apply your changes to the latest version and try again
I1207 02:23:22.086] has:Error from server (Conflict)
I1207 02:23:27.359] deployment.extensions/nginx configured
W1207 02:23:27.460] I1207 02:23:27.363185   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149394-429", Name:"nginx", UID:"148f85bd-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"733", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7777658b9d to 3
W1207 02:23:27.461] I1207 02:23:27.371767   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-7777658b9d", UID:"149033c5-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7777658b9d-9q6h9
W1207 02:23:27.462] I1207 02:23:27.377715   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-7777658b9d", UID:"149033c5-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7777658b9d-lqt67
W1207 02:23:27.463] I1207 02:23:27.378256   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-7777658b9d", UID:"149033c5-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"734", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7777658b9d-bhlpm
I1207 02:23:27.564] Successful
I1207 02:23:27.564] message:        "name": "nginx2"
I1207 02:23:27.565]           "name": "nginx2"
I1207 02:23:27.565] has:"name": "nginx2"
W1207 02:23:31.981] E1207 02:23:31.980474   55610 replica_set.go:450] Sync "namespace-1544149394-429/nginx-7777658b9d" failed with replicasets.apps "nginx-7777658b9d" not found
I1207 02:23:32.941] Successful
I1207 02:23:32.943] message:The Deployment "nginx" is invalid: spec.template.metadata.labels: Invalid value: map[string]string{"name":"nginx3"}: `selector` does not match template `labels`
I1207 02:23:32.943] has:Invalid value
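Note: the "selector does not match template labels" failure above is the invariant that .spec.selector.matchLabels must select .spec.template.metadata.labels. A minimal valid shape (sketch; the nginx3 labels mirror the failing test):

  kubectl apply -f - <<'EOF'
  apiVersion: apps/v1
  kind: Deployment
  metadata:
    name: nginx
  spec:
    replicas: 3
    selector:
      matchLabels:
        name: nginx3        # must match the template labels below
    template:
      metadata:
        labels:
          name: nginx3
      spec:
        containers:
        - name: nginx
          image: k8s.gcr.io/nginx:test-cmd
  EOF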
W1207 02:23:33.044] I1207 02:23:32.952470   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149394-429", Name:"nginx", UID:"17e07538-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"768", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-7777658b9d to 3
W1207 02:23:33.045] I1207 02:23:32.961214   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-7777658b9d", UID:"17e189ec-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7777658b9d-nmcb2
W1207 02:23:33.047] I1207 02:23:32.973877   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149394-429", Name:"nginx-7777658b9d", UID:"17e189ec-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"769", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-7777658b9d-gxt8d
... skipping 73 lines ...
I1207 02:23:35.843] +++ [1207 02:23:35] Creating namespace namespace-1544149415-766
I1207 02:23:35.925] namespace/namespace-1544149415-766 created
I1207 02:23:35.998] Context "test" modified.
I1207 02:23:36.005] +++ [1207 02:23:36] Testing kubectl get
I1207 02:23:36.102] get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:23:36.193] (BSuccessful
I1207 02:23:36.193] message:Error from server (NotFound): pods "abc" not found
I1207 02:23:36.193] has:pods "abc" not found
I1207 02:23:36.290] get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:23:36.378] (BSuccessful
I1207 02:23:36.379] message:Error from server (NotFound): pods "abc" not found
I1207 02:23:36.379] has:pods "abc" not found
I1207 02:23:36.486] get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:23:36.573] (BSuccessful
I1207 02:23:36.573] message:{
I1207 02:23:36.573]     "apiVersion": "v1",
I1207 02:23:36.573]     "items": [],
... skipping 23 lines ...
I1207 02:23:36.949] has not:No resources found
I1207 02:23:37.039] Successful
I1207 02:23:37.039] message:NAME
I1207 02:23:37.039] has not:No resources found
I1207 02:23:37.129] get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:23:37.262] (BSuccessful
I1207 02:23:37.262] message:error: the server doesn't have a resource type "foobar"
I1207 02:23:37.262] has not:No resources found
I1207 02:23:37.357] Successful
I1207 02:23:37.357] message:No resources found.
I1207 02:23:37.358] has:No resources found
I1207 02:23:37.454] Successful
I1207 02:23:37.454] message:
I1207 02:23:37.454] has not:No resources found
I1207 02:23:37.556] Successful
I1207 02:23:37.556] message:No resources found.
I1207 02:23:37.556] has:No resources found
I1207 02:23:37.645] get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:23:37.736] (BSuccessful
I1207 02:23:37.736] message:Error from server (NotFound): pods "abc" not found
I1207 02:23:37.736] has:pods "abc" not found
I1207 02:23:37.738] FAIL!
I1207 02:23:37.738] message:Error from server (NotFound): pods "abc" not found
I1207 02:23:37.738] has not:List
I1207 02:23:37.739] 99 /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/get.sh
I1207 02:23:37.876] Successful
I1207 02:23:37.877] message:I1207 02:23:37.814830   67704 loader.go:359] Config loaded from file /tmp/tmp.G1c9u0Ih7Y/.kube/config
I1207 02:23:37.877] I1207 02:23:37.815383   67704 loader.go:359] Config loaded from file /tmp/tmp.G1c9u0Ih7Y/.kube/config
I1207 02:23:37.877] I1207 02:23:37.816838   67704 round_trippers.go:438] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
... skipping 995 lines ...
I1207 02:23:41.535] }
I1207 02:23:41.639] get.sh:155: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1207 02:23:41.940] <no value>Successful
I1207 02:23:41.940] message:valid-pod:
I1207 02:23:41.940] has:valid-pod:
I1207 02:23:42.029] Successful
I1207 02:23:42.029] message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
I1207 02:23:42.030] 	template was:
I1207 02:23:42.030] 		{.missing}
I1207 02:23:42.030] 	object given to jsonpath engine was:
I1207 02:23:42.031] 		map[string]interface {}{"kind":"Pod", "apiVersion":"v1", "metadata":map[string]interface {}{"uid":"1cf3c744-f9c7-11e8-b772-0242ac110002", "resourceVersion":"806", "creationTimestamp":"2018-12-07T02:23:41Z", "labels":map[string]interface {}{"name":"valid-pod"}, "name":"valid-pod", "namespace":"namespace-1544149421-31015", "selfLink":"/api/v1/namespaces/namespace-1544149421-31015/pods/valid-pod"}, "spec":map[string]interface {}{"priority":0, "enableServiceLinks":true, "containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "resources":map[string]interface {}{"requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname"}}, "restartPolicy":"Always", "terminationGracePeriodSeconds":30, "dnsPolicy":"ClusterFirst", "securityContext":map[string]interface {}{}, "schedulerName":"default-scheduler"}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
I1207 02:23:42.031] has:missing is not found
I1207 02:23:42.112] Successful
I1207 02:23:42.113] message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
I1207 02:23:42.113] 	template was:
I1207 02:23:42.113] 		{{.missing}}
I1207 02:23:42.113] 	raw data was:
I1207 02:23:42.114] 		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2018-12-07T02:23:41Z","labels":{"name":"valid-pod"},"name":"valid-pod","namespace":"namespace-1544149421-31015","resourceVersion":"806","selfLink":"/api/v1/namespaces/namespace-1544149421-31015/pods/valid-pod","uid":"1cf3c744-f9c7-11e8-b772-0242ac110002"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
I1207 02:23:42.114] 	object given to template engine was:
I1207 02:23:42.115] 		map[metadata:map[namespace:namespace-1544149421-31015 resourceVersion:806 selfLink:/api/v1/namespaces/namespace-1544149421-31015/pods/valid-pod uid:1cf3c744-f9c7-11e8-b772-0242ac110002 creationTimestamp:2018-12-07T02:23:41Z labels:map[name:valid-pod] name:valid-pod] spec:map[restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30 containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0] status:map[phase:Pending qosClass:Guaranteed] apiVersion:v1 kind:Pod]
I1207 02:23:42.115] has:map has no entry for key "missing"
W1207 02:23:42.215] error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
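Note: the failures above show that both output engines error loudly on a missing key instead of printing an empty string. The working forms, for contrast:

  kubectl get pod valid-pod -o jsonpath='{.metadata.name}'       # prints: valid-pod
  kubectl get pod valid-pod -o go-template='{{.metadata.name}}'  # prints: valid-pod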
W1207 02:23:43.194] E1207 02:23:43.194050   68084 streamwatcher.go:109] Unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)
I1207 02:23:43.295] Successful
I1207 02:23:43.295] message:NAME        READY   STATUS    RESTARTS   AGE
I1207 02:23:43.296] valid-pod   0/1     Pending   0          1s
I1207 02:23:43.296] has:STATUS
I1207 02:23:43.296] Successful
... skipping 80 lines ...
I1207 02:23:45.537]   terminationGracePeriodSeconds: 30
I1207 02:23:45.537] status:
I1207 02:23:45.537]   phase: Pending
I1207 02:23:45.538]   qosClass: Guaranteed
I1207 02:23:45.538] has:name: valid-pod
I1207 02:23:45.540] Successful
I1207 02:23:45.540] message:Error from server (NotFound): pods "invalid-pod" not found
I1207 02:23:45.540] has:"invalid-pod" not found
I1207 02:23:45.634] pod "valid-pod" deleted
I1207 02:23:45.739] get.sh:193: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:23:45.913] pod/redis-master created
I1207 02:23:45.920] pod/valid-pod created
I1207 02:23:46.048] Successful
... skipping 312 lines ...
I1207 02:23:51.742] Running command: run_create_secret_tests
I1207 02:23:51.765] 
I1207 02:23:51.767] +++ Running case: test-cmd.run_create_secret_tests 
I1207 02:23:51.770] +++ working dir: /go/src/k8s.io/kubernetes
I1207 02:23:51.772] +++ command: run_create_secret_tests
I1207 02:23:51.892] Successful
I1207 02:23:51.892] message:Error from server (NotFound): secrets "mysecret" not found
I1207 02:23:51.893] has:secrets "mysecret" not found
I1207 02:23:52.140] Successful
I1207 02:23:52.141] message:Error from server (NotFound): secrets "mysecret" not found
I1207 02:23:52.142] has:secrets "mysecret" not found
I1207 02:23:52.143] Successful
I1207 02:23:52.143] message:user-specified
I1207 02:23:52.143] has:user-specified
I1207 02:23:52.251] Successful
I1207 02:23:52.358] {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-create-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-create-cm","uid":"2375a162-f9c7-11e8-b772-0242ac110002","resourceVersion":"882","creationTimestamp":"2018-12-07T02:23:52Z"}}
... skipping 80 lines ...
I1207 02:23:54.735] has:Timeout exceeded while reading body
I1207 02:23:54.848] Successful
I1207 02:23:54.848] message:NAME        READY   STATUS    RESTARTS   AGE
I1207 02:23:54.848] valid-pod   0/1     Pending   0          1s
I1207 02:23:54.849] has:valid-pod
I1207 02:23:54.942] Successful
I1207 02:23:54.942] message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
I1207 02:23:54.942] has:Invalid timeout value
I1207 02:23:55.062] pod "valid-pod" deleted
I1207 02:23:55.083] +++ exit code: 0
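Note: the "Invalid timeout value" error above comes from --request-timeout parsing, which accepts a bare integer (seconds) or an integer with a time unit. A sketch:

  kubectl get pods --request-timeout=30    # 30 seconds
  kubectl get pods --request-timeout=1m    # one minute, as used later by crd.sh
  kubectl get pods --request-timeout=fast  # rejected with the error above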
I1207 02:23:55.121] Recording: run_crd_tests
I1207 02:23:55.121] Running command: run_crd_tests
I1207 02:23:55.143] 
... skipping 8 lines ...
I1207 02:23:55.797] crd.sh:47: Successful get customresourcedefinitions {{range.items}}{{if eq .metadata.name \"foos.company.com\"}}{{.metadata.name}}:{{end}}{{end}}: foos.company.com:
I1207 02:23:56.026] customresourcedefinition.apiextensions.k8s.io/bars.company.com created
I1207 02:23:56.162] crd.sh:69: Successful get customresourcedefinitions {{range.items}}{{if eq .metadata.name \"foos.company.com\" \"bars.company.com\"}}{{.metadata.name}}:{{end}}{{end}}: bars.company.com:foos.company.com:
I1207 02:23:56.372] customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I1207 02:23:56.513] crd.sh:96: Successful get customresourcedefinitions {{range.items}}{{if eq .metadata.name \"foos.company.com\" \"bars.company.com\" \"resources.mygroup.example.com\"}}{{.metadata.name}}:{{end}}{{end}}: bars.company.com:foos.company.com:resources.mygroup.example.com:
I1207 02:23:56.759] customresourcedefinition.apiextensions.k8s.io/validfoos.company.com created
W1207 02:23:56.861] E1207 02:23:56.823669   55610 resource_quota_controller.go:437] failed to sync resource monitors: [couldn't start monitor for resource "company.com/v1, Resource=bars": unable to monitor quota for resource "company.com/v1, Resource=bars", couldn't start monitor for resource "company.com/v1, Resource=validfoos": unable to monitor quota for resource "company.com/v1, Resource=validfoos", couldn't start monitor for resource "company.com/v1, Resource=foos": unable to monitor quota for resource "company.com/v1, Resource=foos", couldn't start monitor for resource "mygroup.example.com/v1alpha1, Resource=resources": unable to monitor quota for resource "mygroup.example.com/v1alpha1, Resource=resources", couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"]
I1207 02:23:56.962] crd.sh:131: Successful get customresourcedefinitions {{range.items}}{{if eq .metadata.name \"foos.company.com\" \"bars.company.com\" \"resources.mygroup.example.com\" \"validfoos.company.com\"}}{{.metadata.name}}:{{end}}{{end}}: bars.company.com:foos.company.com:resources.mygroup.example.com:validfoos.company.com:
I1207 02:23:56.962] +++ [1207 02:23:56] Creating namespace namespace-1544149436-5729
I1207 02:23:57.062] namespace/namespace-1544149436-5729 created
W1207 02:23:57.162] I1207 02:23:56.999375   55610 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1207 02:23:57.163] I1207 02:23:57.003851   52262 clientconn.go:551] parsed scheme: ""
W1207 02:23:57.163] I1207 02:23:57.004420   52262 clientconn.go:557] scheme "" not registered, fallback to default scheme
... skipping 154 lines ...
I1207 02:24:01.622] foo.company.com/test patched
I1207 02:24:01.728] crd.sh:237: Successful get foos/test {{.patched}}: value1
I1207 02:24:01.814] foo.company.com/test patched
I1207 02:24:01.922] crd.sh:239: Successful get foos/test {{.patched}}: value2
I1207 02:24:02.008] foo.company.com/test patched
I1207 02:24:02.113] crd.sh:241: Successful get foos/test {{.patched}}: <no value>
I1207 02:24:02.282] +++ [1207 02:24:02] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
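Note: the error above is expected: strategic merge patch needs the compiled-in Go type schema, which custom resources lack, so a plain merge patch must be requested explicitly. A sketch (the --local file path is hypothetical):

  kubectl patch foos/test --type merge -p '{"patched":"value2"}'                 # works against the server
  kubectl patch --local -f foo.yaml --type merge -p '{"patched":null}' -o json   # works locally as well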
I1207 02:24:02.364] {
I1207 02:24:02.365]     "apiVersion": "company.com/v1",
I1207 02:24:02.365]     "kind": "Foo",
I1207 02:24:02.365]     "metadata": {
I1207 02:24:02.365]         "annotations": {
I1207 02:24:02.366]             "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 179 lines ...
W1207 02:24:09.838] /go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/crd.sh: line 295: 70552 Killed                  kubectl "${kube_flags[@]}" get bars --request-timeout=1m --watch-only -o name
I1207 02:24:09.938] bar.company.com/test created
I1207 02:24:09.998] crd.sh:456: Successful get bars {{len .items}}: 1
I1207 02:24:10.077] namespace "non-native-resources" deleted
I1207 02:24:15.403] crd.sh:459: Successful get bars {{len .items}}: 0
I1207 02:24:15.595] customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
W1207 02:24:15.696] Error from server (NotFound): namespaces "non-native-resources" not found
I1207 02:24:15.799] customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
I1207 02:24:15.814] customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
I1207 02:24:15.924] customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
I1207 02:24:15.962] +++ exit code: 0
I1207 02:24:16.006] Recording: run_cmd_with_img_tests
I1207 02:24:16.007] Running command: run_cmd_with_img_tests
... skipping 7 lines ...
I1207 02:24:16.221] +++ [1207 02:24:16] Testing cmd with image
I1207 02:24:16.318] Successful
I1207 02:24:16.318] message:deployment.apps/test1 created
I1207 02:24:16.319] has:deployment.apps/test1 created
I1207 02:24:16.404] deployment.extensions "test1" deleted
I1207 02:24:16.489] Successful
I1207 02:24:16.490] message:error: Invalid image name "InvalidImageName": invalid reference format
I1207 02:24:16.490] has:error: Invalid image name "InvalidImageName": invalid reference format
I1207 02:24:16.507] +++ exit code: 0
I1207 02:24:16.546] Recording: run_recursive_resources_tests
I1207 02:24:16.546] Running command: run_recursive_resources_tests
I1207 02:24:16.576] 
I1207 02:24:16.580] +++ Running case: test-cmd.run_recursive_resources_tests 
I1207 02:24:16.583] +++ working dir: /go/src/k8s.io/kubernetes
... skipping 4 lines ...
I1207 02:24:16.764] Context "test" modified.
I1207 02:24:16.866] generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:17.153] generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:17.156] Successful
I1207 02:24:17.156] message:pod/busybox0 created
I1207 02:24:17.156] pod/busybox1 created
I1207 02:24:17.157] error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1207 02:24:17.157] has:error validating data: kind not set
I1207 02:24:17.254] generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:17.460] generic-resources.sh:219: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
I1207 02:24:17.463] Successful
I1207 02:24:17.464] message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:17.464] has:Object 'Kind' is missing
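Note: the undecodable fixture above misspells the kind field as "ind", so the object carries no Kind; with --recursive kubectl keeps processing the remaining files and reports the decode failure at the end. A sketch using the directory from the log:

  kubectl apply --recursive -f hack/testdata/recursive/pod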
I1207 02:24:17.565] generic-resources.sh:226: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:17.855] generic-resources.sh:230: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I1207 02:24:17.858] Successful
I1207 02:24:17.858] message:pod/busybox0 replaced
I1207 02:24:17.858] pod/busybox1 replaced
I1207 02:24:17.858] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1207 02:24:17.858] has:error validating data: kind not set
I1207 02:24:17.958] generic-resources.sh:235: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:18.067] Successful
I1207 02:24:18.068] message:Name:               busybox0
I1207 02:24:18.068] Namespace:          namespace-1544149456-23559
I1207 02:24:18.068] Priority:           0
I1207 02:24:18.068] PriorityClassName:  <none>
... skipping 159 lines ...
I1207 02:24:18.088] has:Object 'Kind' is missing
I1207 02:24:18.178] generic-resources.sh:245: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:18.392] generic-resources.sh:249: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
I1207 02:24:18.394] Successful
I1207 02:24:18.394] message:pod/busybox0 annotated
I1207 02:24:18.394] pod/busybox1 annotated
I1207 02:24:18.395] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:18.395] has:Object 'Kind' is missing
I1207 02:24:18.496] generic-resources.sh:254: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:18.825] generic-resources.sh:258: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
I1207 02:24:18.828] Successful
I1207 02:24:18.828] message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I1207 02:24:18.828] pod/busybox0 configured
I1207 02:24:18.829] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
I1207 02:24:18.829] pod/busybox1 configured
I1207 02:24:18.829] error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1207 02:24:18.829] has:error validating data: kind not set
W1207 02:24:18.929] kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
W1207 02:24:18.930] I1207 02:24:16.308254   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149456-16213", Name:"test1", UID:"31bbf54d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"988", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test1-fb488bd5d to 1
W1207 02:24:18.930] I1207 02:24:16.314254   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149456-16213", Name:"test1-fb488bd5d", UID:"31bca37a-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"989", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-fb488bd5d-gv7d6
I1207 02:24:19.031] generic-resources.sh:264: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:19.119] deployment.extensions/nginx created
W1207 02:24:19.220] I1207 02:24:19.122242   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149456-23559", Name:"nginx", UID:"33694297-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1013", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-6f6bb85d9c to 3
... skipping 51 lines ...
W1207 02:24:19.721] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I1207 02:24:19.822] generic-resources.sh:280: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:19.938] generic-resources.sh:284: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:19.940] Successful
I1207 02:24:19.941] message:kubectl convert is DEPRECATED and will be removed in a future version.
I1207 02:24:19.941] In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
I1207 02:24:19.941] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:19.941] has:Object 'Kind' is missing
I1207 02:24:20.047] generic-resources.sh:289: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:20.141] Successful
I1207 02:24:20.141] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:20.141] has:busybox0:busybox1:
I1207 02:24:20.143] Successful
I1207 02:24:20.144] message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:20.144] has:Object 'Kind' is missing
I1207 02:24:20.251] generic-resources.sh:298: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:20.357] pod/busybox0 labeled pod/busybox1 labeled error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:20.463] generic-resources.sh:303: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
I1207 02:24:20.466] Successful
I1207 02:24:20.466] message:pod/busybox0 labeled
I1207 02:24:20.466] pod/busybox1 labeled
I1207 02:24:20.467] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:20.467] has:Object 'Kind' is missing
I1207 02:24:20.569] generic-resources.sh:308: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:20.668] pod/busybox0 patched pod/busybox1 patched error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:20.771] generic-resources.sh:313: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
I1207 02:24:20.773] Successful
I1207 02:24:20.774] message:pod/busybox0 patched
I1207 02:24:20.774] pod/busybox1 patched
I1207 02:24:20.774] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:20.774] has:Object 'Kind' is missing
I1207 02:24:20.879] generic-resources.sh:318: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:21.096] generic-resources.sh:322: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:21.098] Successful
I1207 02:24:21.099] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I1207 02:24:21.099] pod "busybox0" force deleted
I1207 02:24:21.099] pod "busybox1" force deleted
I1207 02:24:21.099] error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
I1207 02:24:21.100] has:Object 'Kind' is missing
I1207 02:24:21.193] generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:21.351] replicationcontroller/busybox0 created
I1207 02:24:21.357] replicationcontroller/busybox1 created
W1207 02:24:21.458] I1207 02:24:20.283528   55610 namespace_controller.go:171] Namespace has been deleted non-native-resources
W1207 02:24:21.458] I1207 02:24:21.355350   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149456-23559", Name:"busybox0", UID:"34be161b-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1045", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-mfhjc
W1207 02:24:21.459] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W1207 02:24:21.459] I1207 02:24:21.360223   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149456-23559", Name:"busybox1", UID:"34bef353-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1047", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-g7c5p
I1207 02:24:21.559] generic-resources.sh:331: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:21.573] generic-resources.sh:336: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:21.680] generic-resources.sh:337: Successful get rc busybox0 {{.spec.replicas}}: 1
I1207 02:24:21.789] generic-resources.sh:338: Successful get rc busybox1 {{.spec.replicas}}: 1
I1207 02:24:22.002] generic-resources.sh:343: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I1207 02:24:22.103] (Bgeneric-resources.sh:344: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
I1207 02:24:22.106] Successful
I1207 02:24:22.107] message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
I1207 02:24:22.107] horizontalpodautoscaler.autoscaling/busybox1 autoscaled
I1207 02:24:22.107] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:22.107] has:Object 'Kind' is missing
I1207 02:24:22.197] horizontalpodautoscaler.autoscaling "busybox0" deleted
I1207 02:24:22.322] horizontalpodautoscaler.autoscaling "busybox1" deleted
I1207 02:24:22.437] generic-resources.sh:352: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:22.541] generic-resources.sh:353: Successful get rc busybox0 {{.spec.replicas}}: 1
I1207 02:24:22.638] generic-resources.sh:354: Successful get rc busybox1 {{.spec.replicas}}: 1
I1207 02:24:23.068] generic-resources.sh:358: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I1207 02:24:23.069] generic-resources.sh:359: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
I1207 02:24:23.069] Successful
I1207 02:24:23.069] message:service/busybox0 exposed
I1207 02:24:23.069] service/busybox1 exposed
I1207 02:24:23.069] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:23.070] has:Object 'Kind' is missing
I1207 02:24:23.070] generic-resources.sh:365: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:23.160] generic-resources.sh:366: Successful get rc busybox0 {{.spec.replicas}}: 1
I1207 02:24:23.264] generic-resources.sh:367: Successful get rc busybox1 {{.spec.replicas}}: 1
I1207 02:24:23.504] generic-resources.sh:371: Successful get rc busybox0 {{.spec.replicas}}: 2
I1207 02:24:23.773] generic-resources.sh:372: Successful get rc busybox1 {{.spec.replicas}}: 2
I1207 02:24:23.773] Successful
I1207 02:24:23.773] message:replicationcontroller/busybox0 scaled
I1207 02:24:23.773] replicationcontroller/busybox1 scaled
I1207 02:24:23.774] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:23.774] has:Object 'Kind' is missing
I1207 02:24:23.821] generic-resources.sh:377: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:24.016] generic-resources.sh:381: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:24.019] Successful
I1207 02:24:24.019] message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I1207 02:24:24.019] replicationcontroller "busybox0" force deleted
I1207 02:24:24.020] replicationcontroller "busybox1" force deleted
I1207 02:24:24.020] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:24.020] has:Object 'Kind' is missing
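Note: the `generic-resources.sh` assertions above render each object list through a Go template such as `{{range.items}}{{.metadata.name}}:{{end}}`, which is why two controllers print as `busybox0:busybox1:`. A self-contained sketch with invented stand-in types; field names are capitalized here because templates over Go structs need exported fields, whereas kubectl templates over JSON maps use the lowercase keys seen in the log:

    package main

    import (
    	"os"
    	"text/template"
    )

    // Stand-in shapes for the items a `kubectl get -o go-template`
    // call would feed the template; invented for this sketch.
    type Meta struct{ Name string }
    type Item struct{ Metadata Meta }
    type List struct{ Items []Item }

    func main() {
    	// Same pattern the test harness uses for its name assertions.
    	tmpl := template.Must(template.New("names").Parse(
    		`{{range .Items}}{{.Metadata.Name}}:{{end}}`))
    	list := List{Items: []Item{{Meta{"busybox0"}}, {Meta{"busybox1"}}}}
    	if err := tmpl.Execute(os.Stdout, list); err != nil { // busybox0:busybox1:
    		panic(err)
    	}
    }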
I1207 02:24:24.118] generic-resources.sh:386: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:24.286] deployment.extensions/nginx1-deployment created
W1207 02:24:24.387] I1207 02:24:23.388321   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149456-23559", Name:"busybox0", UID:"34be161b-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1066", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-v8h2r
W1207 02:24:24.388] I1207 02:24:23.398590   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149456-23559", Name:"busybox1", UID:"34bef353-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1071", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-54qs8
W1207 02:24:24.388] I1207 02:24:24.290563   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149456-23559", Name:"nginx1-deployment", UID:"367dd0b6-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1087", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-75f6fc6747 to 2
W1207 02:24:24.389] I1207 02:24:24.295569   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149456-23559", Name:"nginx1-deployment-75f6fc6747", UID:"367e96a7-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1088", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-75f6fc6747-998j4
W1207 02:24:24.389] I1207 02:24:24.298589   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149456-23559", Name:"nginx1-deployment-75f6fc6747", UID:"367e96a7-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1088", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-75f6fc6747-m9hk9
W1207 02:24:24.414] error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W1207 02:24:24.417] I1207 02:24:24.416796   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149456-23559", Name:"nginx0-deployment", UID:"36912c6f-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1099", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-b6bb4ccbb to 2
W1207 02:24:24.421] I1207 02:24:24.420631   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149456-23559", Name:"nginx0-deployment-b6bb4ccbb", UID:"3691d70e-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1100", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-b6bb4ccbb-sf99r
W1207 02:24:24.424] I1207 02:24:24.423771   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149456-23559", Name:"nginx0-deployment-b6bb4ccbb", UID:"3691d70e-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1100", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-b6bb4ccbb-7thlt
I1207 02:24:24.525] deployment.extensions/nginx0-deployment created
I1207 02:24:24.752] generic-resources.sh:390: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
I1207 02:24:24.886] generic-resources.sh:391: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I1207 02:24:25.128] generic-resources.sh:395: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
I1207 02:24:25.130] Successful
I1207 02:24:25.131] message:deployment.extensions/nginx1-deployment skipped rollback (current template already matches revision 1)
I1207 02:24:25.131] deployment.extensions/nginx0-deployment skipped rollback (current template already matches revision 1)
I1207 02:24:25.131] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1207 02:24:25.131] has:Object 'Kind' is missing
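Note: `skipped rollback (current template already matches revision 1)` suggests a guard of the form "if the live pod template already equals the target revision's template, do nothing". A hedged sketch of that comparison; the types and helper are invented for illustration, not kubectl's actual rollbacker:

    package main

    import (
    	"fmt"
    	"reflect"
    )

    // PodTemplate is an invented stand-in for a deployment's pod
    // template spec.
    type PodTemplate struct{ Image string }

    // rollbackTo skips the rollback when the current template already
    // matches the requested revision, as the output above shows.
    func rollbackTo(current, target PodTemplate, rev int) string {
    	if reflect.DeepEqual(current, target) {
    		return fmt.Sprintf("skipped rollback (current template already matches revision %d)", rev)
    	}
    	return fmt.Sprintf("rolled back to revision %d", rev)
    }

    func main() {
    	tpl := PodTemplate{Image: "k8s.gcr.io/nginx:1.7.9"}
    	fmt.Println(rollbackTo(tpl, tpl, 1))
    }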
I1207 02:24:25.231] deployment.extensions/nginx1-deployment paused
I1207 02:24:25.236] deployment.extensions/nginx0-deployment paused
I1207 02:24:25.355] generic-resources.sh:402: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
I1207 02:24:25.358] Successful
I1207 02:24:25.358] message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
I1207 02:24:25.762] 1         <none>
I1207 02:24:25.762] 
I1207 02:24:25.762] deployment.extensions/nginx0-deployment 
I1207 02:24:25.762] REVISION  CHANGE-CAUSE
I1207 02:24:25.762] 1         <none>
I1207 02:24:25.762] 
I1207 02:24:25.762] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1207 02:24:25.762] has:nginx0-deployment
I1207 02:24:25.764] Successful
I1207 02:24:25.764] message:deployment.extensions/nginx1-deployment 
I1207 02:24:25.764] REVISION  CHANGE-CAUSE
I1207 02:24:25.764] 1         <none>
I1207 02:24:25.765] 
I1207 02:24:25.765] deployment.extensions/nginx0-deployment 
I1207 02:24:25.765] REVISION  CHANGE-CAUSE
I1207 02:24:25.765] 1         <none>
I1207 02:24:25.765] 
I1207 02:24:25.765] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1207 02:24:25.766] has:nginx1-deployment
I1207 02:24:25.767] Successful
I1207 02:24:25.767] message:deployment.extensions/nginx1-deployment 
I1207 02:24:25.767] REVISION  CHANGE-CAUSE
I1207 02:24:25.767] 1         <none>
I1207 02:24:25.767] 
I1207 02:24:25.768] deployment.extensions/nginx0-deployment 
I1207 02:24:25.768] REVISION  CHANGE-CAUSE
I1207 02:24:25.768] 1         <none>
I1207 02:24:25.768] 
I1207 02:24:25.768] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
I1207 02:24:25.768] has:Object 'Kind' is missing
I1207 02:24:25.860] deployment.extensions "nginx1-deployment" force deleted
I1207 02:24:25.866] deployment.extensions "nginx0-deployment" force deleted
W1207 02:24:25.967] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1207 02:24:25.968] error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"extensions/v1beta1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
W1207 02:24:26.833] E1207 02:24:26.832238   55610 resource_quota_controller.go:437] failed to sync resource monitors: couldn't start monitor for resource "extensions/v1beta1, Resource=networkpolicies": unable to monitor quota for resource "extensions/v1beta1, Resource=networkpolicies"
I1207 02:24:26.972] generic-resources.sh:424: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:27.158] replicationcontroller/busybox0 created
I1207 02:24:27.167] replicationcontroller/busybox1 created
W1207 02:24:27.268] I1207 02:24:27.137404   55610 controller_utils.go:1027] Waiting for caches to sync for garbage collector controller
W1207 02:24:27.268] I1207 02:24:27.162612   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149456-23559", Name:"busybox0", UID:"3834142e-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1132", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-vqrbh
W1207 02:24:27.268] error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
W1207 02:24:27.269] I1207 02:24:27.172318   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149456-23559", Name:"busybox1", UID:"38357f57-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1134", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-6k5dl
W1207 02:24:27.269] I1207 02:24:27.237925   55610 controller_utils.go:1034] Caches are synced for garbage collector controller
I1207 02:24:27.370] generic-resources.sh:428: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1207 02:24:27.392] (BSuccessful
I1207 02:24:27.392] message:no rollbacker has been implemented for "ReplicationController"
I1207 02:24:27.392] no rollbacker has been implemented for "ReplicationController"
... skipping 3 lines ...
I1207 02:24:27.395] message:no rollbacker has been implemented for "ReplicationController"
I1207 02:24:27.396] no rollbacker has been implemented for "ReplicationController"
I1207 02:24:27.396] unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:27.396] has:Object 'Kind' is missing
I1207 02:24:27.495] Successful
I1207 02:24:27.496] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:27.496] error: replicationcontrollers "busybox0" pausing is not supported
I1207 02:24:27.497] error: replicationcontrollers "busybox1" pausing is not supported
I1207 02:24:27.497] has:Object 'Kind' is missing
I1207 02:24:27.498] Successful
I1207 02:24:27.499] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:27.499] error: replicationcontrollers "busybox0" pausing is not supported
I1207 02:24:27.499] error: replicationcontrollers "busybox1" pausing is not supported
I1207 02:24:27.499] has:replicationcontrollers "busybox0" pausing is not supported
I1207 02:24:27.500] Successful
I1207 02:24:27.501] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:27.501] error: replicationcontrollers "busybox0" pausing is not supported
I1207 02:24:27.501] error: replicationcontrollers "busybox1" pausing is not supported
I1207 02:24:27.501] has:replicationcontrollers "busybox1" pausing is not supported
I1207 02:24:27.601] Successful
I1207 02:24:27.602] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:27.602] error: replicationcontrollers "busybox0" resuming is not supported
I1207 02:24:27.602] error: replicationcontrollers "busybox1" resuming is not supported
I1207 02:24:27.602] has:Object 'Kind' is missing
I1207 02:24:27.604] Successful
I1207 02:24:27.604] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:27.605] error: replicationcontrollers "busybox0" resuming is not supported
I1207 02:24:27.605] error: replicationcontrollers "busybox1" resuming is not supported
I1207 02:24:27.605] has:replicationcontrollers "busybox0" resuming is not supported
I1207 02:24:27.607] Successful
I1207 02:24:27.608] message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:27.608] error: replicationcontrollers "busybox0" resuming is not supported
I1207 02:24:27.608] error: replicationcontrollers "busybox1" resuming is not supported
I1207 02:24:27.609] has:replicationcontrollers "busybox0" resuming is not supported
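Note: the deployments above pause and resume cleanly while replication controllers report `pausing is not supported` and `resuming is not supported`, consistent with kubectl dispatching on whether a kind implements the operation (compare the earlier `no rollbacker has been implemented for "ReplicationController"`). An illustrative interface-based sketch; the names are invented, not kubectl's internals:

    package main

    import "fmt"

    // pauser is a hypothetical capability implemented only by kinds
    // that support `rollout pause`.
    type pauser interface{ Pause() }

    type deployment struct{}

    func (deployment) Pause() {}

    type replicationController struct{}

    // pause mirrors the errors in the log: kinds without the
    // capability are rejected by name.
    func pause(kind string, obj interface{}) error {
    	if p, ok := obj.(pauser); ok {
    		p.Pause()
    		return nil
    	}
    	return fmt.Errorf("%s pausing is not supported", kind)
    }

    func main() {
    	fmt.Println(pause("deployments", deployment{}))                              // <nil>
    	fmt.Println(pause(`replicationcontrollers "busybox0"`, replicationController{})) // not supported
    }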
I1207 02:24:27.696] replicationcontroller "busybox0" force deleted
I1207 02:24:27.702] replicationcontroller "busybox1" force deleted
W1207 02:24:27.803] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
W1207 02:24:27.804] error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
I1207 02:24:28.726] +++ exit code: 0
I1207 02:24:29.102] Recording: run_namespace_tests
I1207 02:24:29.102] Running command: run_namespace_tests
I1207 02:24:29.125] 
I1207 02:24:29.127] +++ Running case: test-cmd.run_namespace_tests 
I1207 02:24:29.131] +++ working dir: /go/src/k8s.io/kubernetes
I1207 02:24:29.134] +++ command: run_namespace_tests
I1207 02:24:29.147] +++ [1207 02:24:29] Testing kubectl(v1:namespaces)
I1207 02:24:29.224] namespace/my-namespace created
I1207 02:24:29.325] core.sh:1295: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I1207 02:24:29.405] namespace "my-namespace" deleted
I1207 02:24:34.601] namespace/my-namespace condition met
I1207 02:24:34.698] Successful
I1207 02:24:34.699] message:Error from server (NotFound): namespaces "my-namespace" not found
I1207 02:24:34.699] has: not found
I1207 02:24:34.826] core.sh:1310: Successful get namespaces {{range.items}}{{ if eq $id_field \"other\" }}found{{end}}{{end}}:: :
I1207 02:24:34.909] namespace/other created
I1207 02:24:35.009] core.sh:1314: Successful get namespaces/other {{.metadata.name}}: other
I1207 02:24:35.111] core.sh:1318: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:35.281] pod/valid-pod created
I1207 02:24:35.387] core.sh:1322: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1207 02:24:35.489] core.sh:1324: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1207 02:24:35.580] Successful
I1207 02:24:35.580] message:error: a resource cannot be retrieved by name across all namespaces
I1207 02:24:35.581] has:a resource cannot be retrieved by name across all namespaces
I1207 02:24:35.680] core.sh:1331: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
I1207 02:24:35.773] pod "valid-pod" force deleted
W1207 02:24:35.874] warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
I1207 02:24:35.974] core.sh:1335: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:24:35.975] namespace "other" deleted
... skipping 117 lines ...
I1207 02:24:57.876] +++ command: run_client_config_tests
I1207 02:24:57.892] +++ [1207 02:24:57] Creating namespace namespace-1544149497-17940
I1207 02:24:57.969] namespace/namespace-1544149497-17940 created
I1207 02:24:58.044] Context "test" modified.
I1207 02:24:58.051] +++ [1207 02:24:58] Testing client config
I1207 02:24:58.126] Successful
I1207 02:24:58.126] message:error: stat missing: no such file or directory
I1207 02:24:58.127] has:missing: no such file or directory
I1207 02:24:58.197] Successful
I1207 02:24:58.198] message:error: stat missing: no such file or directory
I1207 02:24:58.198] has:missing: no such file or directory
I1207 02:24:58.276] Successful
I1207 02:24:58.277] message:error: stat missing: no such file or directory
I1207 02:24:58.277] has:missing: no such file or directory
I1207 02:24:58.350] Successful
I1207 02:24:58.350] message:Error in configuration: context was not found for specified context: missing-context
I1207 02:24:58.350] has:context was not found for specified context: missing-context
I1207 02:24:58.425] Successful
I1207 02:24:58.426] message:error: no server found for cluster "missing-cluster"
I1207 02:24:58.426] has:no server found for cluster "missing-cluster"
I1207 02:24:58.503] Successful
I1207 02:24:58.503] message:error: auth info "missing-user" does not exist
I1207 02:24:58.503] has:auth info "missing-user" does not exist
I1207 02:24:58.646] Successful
I1207 02:24:58.646] message:error: Error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
I1207 02:24:58.647] has:Error loading config file
I1207 02:24:58.720] Successful
I1207 02:24:58.720] message:error: stat missing-config: no such file or directory
I1207 02:24:58.720] has:no such file or directory
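Note: the `stat missing: no such file or directory` messages above are plain os.Stat failures surfaced when the client config points at a nonexistent path. A stdlib Go sketch reproducing the wording; the loadKubeconfig helper is invented:

    package main

    import (
    	"fmt"
    	"os"
    )

    // loadKubeconfig (hypothetical) stats the path before parsing,
    // which is where the "stat missing: no such file or directory"
    // text originates.
    func loadKubeconfig(path string) error {
    	if _, err := os.Stat(path); err != nil {
    		return err // e.g. "stat missing: no such file or directory"
    	}
    	return nil // parsing would happen here
    }

    func main() {
    	fmt.Println("error:", loadKubeconfig("missing"))
    }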
I1207 02:24:58.737] +++ exit code: 0
I1207 02:24:58.780] Recording: run_service_accounts_tests
I1207 02:24:58.780] Running command: run_service_accounts_tests
I1207 02:24:58.804] 
I1207 02:24:58.806] +++ Running case: test-cmd.run_service_accounts_tests 
... skipping 78 lines ...
I1207 02:25:06.361]                 job-name=test-job
I1207 02:25:06.362]                 run=pi
I1207 02:25:06.362] Annotations:    cronjob.kubernetes.io/instantiate: manual
I1207 02:25:06.362] Parallelism:    1
I1207 02:25:06.362] Completions:    1
I1207 02:25:06.362] Start Time:     Fri, 07 Dec 2018 02:25:06 +0000
I1207 02:25:06.362] Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
I1207 02:25:06.363] Pod Template:
I1207 02:25:06.363]   Labels:  controller-uid=4f646ac7-f9c7-11e8-b772-0242ac110002
I1207 02:25:06.363]            job-name=test-job
I1207 02:25:06.363]            run=pi
I1207 02:25:06.363]   Containers:
I1207 02:25:06.363]    pi:
... skipping 327 lines ...
I1207 02:25:16.486]   selector:
I1207 02:25:16.486]     role: padawan
I1207 02:25:16.486]   sessionAffinity: None
I1207 02:25:16.486]   type: ClusterIP
I1207 02:25:16.486] status:
I1207 02:25:16.486]   loadBalancer: {}
W1207 02:25:16.587] error: you must specify resources by --filename when --local is set.
W1207 02:25:16.588] Example resource specifications include:
W1207 02:25:16.588]    '-f rsrc.yaml'
W1207 02:25:16.588]    '--filename=rsrc.json'
I1207 02:25:16.689] core.sh:886: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I1207 02:25:16.855] core.sh:893: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
I1207 02:25:16.944] (Bservice "redis-master" deleted
... skipping 93 lines ...
I1207 02:25:23.336] apps.sh:80: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1207 02:25:23.437] apps.sh:81: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I1207 02:25:23.550] daemonset.extensions/bind rolled back
I1207 02:25:23.657] apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I1207 02:25:23.756] apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1207 02:25:23.868] Successful
I1207 02:25:23.868] message:error: unable to find specified revision 1000000 in history
I1207 02:25:23.868] has:unable to find specified revision
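Note: `unable to find specified revision 1000000 in history` is the expected failure when a rollback targets a revision the controller never recorded. A toy lookup under that assumption; the history map and findRevision helper are invented:

    package main

    import "fmt"

    // findRevision looks a revision up in recorded history and fails
    // with the same wording the daemonset test asserts on.
    func findRevision(history map[int64]string, rev int64) (string, error) {
    	tpl, ok := history[rev]
    	if !ok {
    		return "", fmt.Errorf("unable to find specified revision %d in history", rev)
    	}
    	return tpl, nil
    }

    func main() {
    	// Two recorded revisions, matching the images the rollback
    	// assertions above flip between.
    	history := map[int64]string{1: "k8s.gcr.io/pause:2.0", 2: "k8s.gcr.io/pause:latest"}
    	_, err := findRevision(history, 1000000)
    	fmt.Println("error:", err)
    }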
I1207 02:25:23.969] apps.sh:89: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I1207 02:25:24.069] apps.sh:90: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1207 02:25:24.195] daemonset.extensions/bind rolled back
I1207 02:25:24.307] apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
I1207 02:25:24.408] apps.sh:94: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 13 lines ...
I1207 02:25:24.967] core.sh:1008: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:25.123] replicationcontroller/frontend created
I1207 02:25:25.214] replicationcontroller "frontend" deleted
I1207 02:25:25.317] core.sh:1013: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:25.442] core.sh:1017: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:25.593] replicationcontroller/frontend created
W1207 02:25:25.696] E1207 02:25:23.561500   55610 daemon_controller.go:303] namespace-1544149521-4527/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1544149521-4527", SelfLink:"/apis/apps/v1/namespaces/namespace-1544149521-4527/daemonsets/bind", UID:"58e6900f-f9c7-11e8-b772-0242ac110002", ResourceVersion:"1351", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63679746322, loc:(*time.Location)(0x66fa920)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"extensions/v1beta1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"name\":\"bind\",\"namespace\":\"namespace-1544149521-4527\"},\"spec\":{\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:""}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc00143aaa0), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Initializers:(*v1.Initializers)(nil), Finalizers:[]string(nil), ClusterName:""}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002cf2618), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc002b6cde0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc00143aba0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc000507980)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002cf26c0)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
W1207 02:25:25.696] I1207 02:25:25.130351   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5ac0f896-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1363", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-6wm82
W1207 02:25:25.697] I1207 02:25:25.133662   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5ac0f896-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1363", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-629dr
W1207 02:25:25.697] I1207 02:25:25.135023   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5ac0f896-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1363", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tkvp8
W1207 02:25:25.698] I1207 02:25:25.596241   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5b08822f-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-7gztt
W1207 02:25:25.698] I1207 02:25:25.601801   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5b08822f-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-756ls
W1207 02:25:25.698] I1207 02:25:25.601870   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5b08822f-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1379", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-788zl
... skipping 3 lines ...
I1207 02:25:25.868] Namespace:    namespace-1544149524-28288
I1207 02:25:25.868] Selector:     app=guestbook,tier=frontend
I1207 02:25:25.868] Labels:       app=guestbook
I1207 02:25:25.868]               tier=frontend
I1207 02:25:25.868] Annotations:  <none>
I1207 02:25:25.868] Replicas:     3 current / 3 desired
I1207 02:25:25.869] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:25.869] Pod Template:
I1207 02:25:25.869]   Labels:  app=guestbook
I1207 02:25:25.869]            tier=frontend
I1207 02:25:25.869]   Containers:
I1207 02:25:25.869]    php-redis:
I1207 02:25:25.869]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I1207 02:25:25.996] Namespace:    namespace-1544149524-28288
I1207 02:25:25.997] Selector:     app=guestbook,tier=frontend
I1207 02:25:25.997] Labels:       app=guestbook
I1207 02:25:25.997]               tier=frontend
I1207 02:25:25.997] Annotations:  <none>
I1207 02:25:25.997] Replicas:     3 current / 3 desired
I1207 02:25:25.997] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:25.997] Pod Template:
I1207 02:25:25.997]   Labels:  app=guestbook
I1207 02:25:25.997]            tier=frontend
I1207 02:25:25.997]   Containers:
I1207 02:25:25.998]    php-redis:
I1207 02:25:25.998]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I1207 02:25:26.119] Namespace:    namespace-1544149524-28288
I1207 02:25:26.119] Selector:     app=guestbook,tier=frontend
I1207 02:25:26.119] Labels:       app=guestbook
I1207 02:25:26.119]               tier=frontend
I1207 02:25:26.119] Annotations:  <none>
I1207 02:25:26.119] Replicas:     3 current / 3 desired
I1207 02:25:26.120] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:26.120] Pod Template:
I1207 02:25:26.120]   Labels:  app=guestbook
I1207 02:25:26.120]            tier=frontend
I1207 02:25:26.120]   Containers:
I1207 02:25:26.120]    php-redis:
I1207 02:25:26.120]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
I1207 02:25:26.240] Namespace:    namespace-1544149524-28288
I1207 02:25:26.240] Selector:     app=guestbook,tier=frontend
I1207 02:25:26.240] Labels:       app=guestbook
I1207 02:25:26.240]               tier=frontend
I1207 02:25:26.240] Annotations:  <none>
I1207 02:25:26.241] Replicas:     3 current / 3 desired
I1207 02:25:26.241] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:26.241] Pod Template:
I1207 02:25:26.241]   Labels:  app=guestbook
I1207 02:25:26.241]            tier=frontend
I1207 02:25:26.241]   Containers:
I1207 02:25:26.241]    php-redis:
I1207 02:25:26.242]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
I1207 02:25:26.404] Namespace:    namespace-1544149524-28288
I1207 02:25:26.405] Selector:     app=guestbook,tier=frontend
I1207 02:25:26.405] Labels:       app=guestbook
I1207 02:25:26.405]               tier=frontend
I1207 02:25:26.405] Annotations:  <none>
I1207 02:25:26.405] Replicas:     3 current / 3 desired
I1207 02:25:26.405] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:26.405] Pod Template:
I1207 02:25:26.406]   Labels:  app=guestbook
I1207 02:25:26.406]            tier=frontend
I1207 02:25:26.406]   Containers:
I1207 02:25:26.406]    php-redis:
I1207 02:25:26.406]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I1207 02:25:26.527] Namespace:    namespace-1544149524-28288
I1207 02:25:26.527] Selector:     app=guestbook,tier=frontend
I1207 02:25:26.527] Labels:       app=guestbook
I1207 02:25:26.527]               tier=frontend
I1207 02:25:26.528] Annotations:  <none>
I1207 02:25:26.528] Replicas:     3 current / 3 desired
I1207 02:25:26.528] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:26.528] Pod Template:
I1207 02:25:26.528]   Labels:  app=guestbook
I1207 02:25:26.528]            tier=frontend
I1207 02:25:26.528]   Containers:
I1207 02:25:26.528]    php-redis:
I1207 02:25:26.529]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
I1207 02:25:26.644] Namespace:    namespace-1544149524-28288
I1207 02:25:26.644] Selector:     app=guestbook,tier=frontend
I1207 02:25:26.644] Labels:       app=guestbook
I1207 02:25:26.644]               tier=frontend
I1207 02:25:26.644] Annotations:  <none>
I1207 02:25:26.644] Replicas:     3 current / 3 desired
I1207 02:25:26.645] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:26.645] Pod Template:
I1207 02:25:26.645]   Labels:  app=guestbook
I1207 02:25:26.645]            tier=frontend
I1207 02:25:26.645]   Containers:
I1207 02:25:26.645]    php-redis:
I1207 02:25:26.646]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
I1207 02:25:26.764] Namespace:    namespace-1544149524-28288
I1207 02:25:26.764] Selector:     app=guestbook,tier=frontend
I1207 02:25:26.764] Labels:       app=guestbook
I1207 02:25:26.764]               tier=frontend
I1207 02:25:26.765] Annotations:  <none>
I1207 02:25:26.765] Replicas:     3 current / 3 desired
I1207 02:25:26.765] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:26.765] Pod Template:
I1207 02:25:26.765]   Labels:  app=guestbook
I1207 02:25:26.765]            tier=frontend
I1207 02:25:26.765]   Containers:
I1207 02:25:26.766]    php-redis:
I1207 02:25:26.766]     Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 22 lines ...
I1207 02:25:27.667] core.sh:1061: Successful get rc frontend {{.spec.replicas}}: 3
I1207 02:25:27.763] core.sh:1065: Successful get rc frontend {{.spec.replicas}}: 3
I1207 02:25:27.861] replicationcontroller/frontend scaled
I1207 02:25:27.962] core.sh:1069: Successful get rc frontend {{.spec.replicas}}: 2
I1207 02:25:28.054] replicationcontroller "frontend" deleted
W1207 02:25:28.155] I1207 02:25:26.970116   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5b08822f-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1389", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-788zl
W1207 02:25:28.155] error: Expected replicas to be 3, was 2
W1207 02:25:28.155] I1207 02:25:27.568241   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5b08822f-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1395", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-m5jsx
W1207 02:25:28.156] I1207 02:25:27.868260   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"frontend", UID:"5b08822f-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1400", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-m5jsx
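Note: `error: Expected replicas to be 3, was 2` above is a scale precondition failure: when the caller states the current replica count, the resize is refused unless the observed count matches. A compare-before-write sketch; the function and types are invented:

    package main

    import "fmt"

    // scaleWithPrecondition refuses to resize unless the live replica
    // count matches the caller's stated expectation.
    func scaleWithPrecondition(live, expected, target int) (int, error) {
    	if live != expected {
    		return live, fmt.Errorf("Expected replicas to be %d, was %d", expected, live)
    	}
    	return target, nil
    }

    func main() {
    	// Mirrors the log: the rc was already at 2 when 3 was expected.
    	_, err := scaleWithPrecondition(2, 3, 1)
    	fmt.Println("error:", err)
    }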
W1207 02:25:28.228] I1207 02:25:28.227475   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"redis-master", UID:"5c99eb3e-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1412", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-jh42s
I1207 02:25:28.329] replicationcontroller/redis-master created
I1207 02:25:28.388] replicationcontroller/redis-slave created
W1207 02:25:28.489] I1207 02:25:28.392788   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1544149524-28288", Name:"redis-slave", UID:"5cb31c8f-f9c7-11e8-b772-0242ac110002", APIVersion:"v1", ResourceVersion:"1417", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-2fgqd
... skipping 36 lines ...
I1207 02:25:30.112] service "expose-test-deployment" deleted
I1207 02:25:30.221] Successful
I1207 02:25:30.221] message:service/expose-test-deployment exposed
I1207 02:25:30.222] has:service/expose-test-deployment exposed
I1207 02:25:30.310] service "expose-test-deployment" deleted
I1207 02:25:30.408] Successful
I1207 02:25:30.408] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I1207 02:25:30.408] See 'kubectl expose -h' for help and examples
I1207 02:25:30.408] has:invalid deployment: no selectors
I1207 02:25:30.500] Successful
I1207 02:25:30.501] message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
I1207 02:25:30.501] See 'kubectl expose -h' for help and examples
I1207 02:25:30.501] has:invalid deployment: no selectors
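Note: `couldn't retrieve selectors via --selector flag or introspection` appears when expose finds neither an explicit selector nor one on the object itself. A sketch of that fallback order; the struct shape and helper are illustrative only:

    package main

    import "fmt"

    // deployment is an invented stand-in; a nil selector means there
    // is nothing to introspect.
    type deployment struct {
    	selector map[string]string
    }

    // selectorFor prefers an explicit --selector value, then the
    // object's own selector, then fails the way the tests expect.
    func selectorFor(flag map[string]string, d deployment) (map[string]string, error) {
    	if len(flag) > 0 {
    		return flag, nil
    	}
    	if len(d.selector) > 0 {
    		return d.selector, nil
    	}
    	return nil, fmt.Errorf("couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed")
    }

    func main() {
    	_, err := selectorFor(nil, deployment{})
    	fmt.Println("error:", err)
    }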
I1207 02:25:30.653] deployment.extensions/nginx-deployment created
I1207 02:25:30.755] core.sh:1133: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
I1207 02:25:30.853] service/nginx-deployment exposed
I1207 02:25:30.953] core.sh:1137: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
... skipping 23 lines ...
I1207 02:25:32.682] service "frontend" deleted
I1207 02:25:32.691] service "frontend-2" deleted
I1207 02:25:32.701] service "frontend-3" deleted
I1207 02:25:32.712] service "frontend-4" deleted
I1207 02:25:32.724] service "frontend-5" deleted
I1207 02:25:32.833] Successful
I1207 02:25:32.833] message:error: cannot expose a Node
I1207 02:25:32.833] has:cannot expose
I1207 02:25:32.930] Successful
I1207 02:25:32.931] message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
I1207 02:25:32.931] has:metadata.name: Invalid value
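Note: the oversized service name above is rejected because object names must be valid DNS labels of at most 63 characters. A minimal Go check producing the same complaint, simplified to the length rule only (real validation also restricts the character set):

    package main

    import "fmt"

    // validateName enforces just the 63-character DNS-label bound the
    // test trips.
    func validateName(name string) error {
    	if len(name) > 63 {
    		return fmt.Errorf("metadata.name: Invalid value: %q: must be no more than 63 characters", name)
    	}
    	return nil
    }

    func main() {
    	fmt.Println(validateName("invalid-large-service-name-that-has-more-than-sixty-three-characters"))
    }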
I1207 02:25:33.032] Successful
I1207 02:25:33.033] message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 30 lines ...
I1207 02:25:35.103] horizontalpodautoscaler.autoscaling/frontend autoscaled
I1207 02:25:35.200] core.sh:1233: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
I1207 02:25:35.291] horizontalpodautoscaler.autoscaling "frontend" deleted
I1207 02:25:35.395] horizontalpodautoscaler.autoscaling/frontend autoscaled
I1207 02:25:35.499] core.sh:1237: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I1207 02:25:35.590] horizontalpodautoscaler.autoscaling "frontend" deleted
W1207 02:25:35.691] Error: required flag(s) "max" not set
W1207 02:25:35.692] 
W1207 02:25:35.692] 
W1207 02:25:35.692] Examples:
W1207 02:25:35.692]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W1207 02:25:35.692]   kubectl autoscale deployment foo --min=2 --max=10
W1207 02:25:35.692]   
... skipping 54 lines ...
I1207 02:25:35.949]           limits:
I1207 02:25:35.949]             cpu: 300m
I1207 02:25:35.949]           requests:
I1207 02:25:35.949]             cpu: 300m
I1207 02:25:35.949]       terminationGracePeriodSeconds: 0
I1207 02:25:35.949] status: {}
W1207 02:25:36.050] Error from server (NotFound): deployments.extensions "nginx-deployment-resources" not found
I1207 02:25:36.197] deployment.extensions/nginx-deployment-resources created
I1207 02:25:36.304] core.sh:1252: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
I1207 02:25:36.403] core.sh:1253: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1207 02:25:36.514] core.sh:1254: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
I1207 02:25:36.615] deployment.extensions/nginx-deployment-resources resource requirements updated
W1207 02:25:36.716] I1207 02:25:36.200295   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources", UID:"615a8076-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1657", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-69c96fd869 to 3
... skipping 2 lines ...
W1207 02:25:36.717] I1207 02:25:36.209524   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-69c96fd869", UID:"615b2e8a-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1658", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-69c96fd869-clt8h
W1207 02:25:36.718] I1207 02:25:36.619483   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources", UID:"615a8076-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1672", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c5996c457 to 1
W1207 02:25:36.718] I1207 02:25:36.623434   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-6c5996c457", UID:"619b0ac8-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1673", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c5996c457-24cpz
W1207 02:25:36.718] I1207 02:25:36.627237   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources", UID:"615a8076-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1672", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-69c96fd869 to 2
W1207 02:25:36.718] I1207 02:25:36.632679   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-69c96fd869", UID:"615b2e8a-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1678", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-69c96fd869-vgcfb
W1207 02:25:36.719] I1207 02:25:36.635316   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources", UID:"615a8076-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1676", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c5996c457 to 2
W1207 02:25:36.719] E1207 02:25:36.635820   55610 replica_set.go:450] Sync "namespace-1544149524-28288/nginx-deployment-resources-6c5996c457" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-resources-6c5996c457": the object has been modified; please apply your changes to the latest version and try again
W1207 02:25:36.719] I1207 02:25:36.640158   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-6c5996c457", UID:"619b0ac8-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1683", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c5996c457-gvjtk
I1207 02:25:36.820] core.sh:1257: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
I1207 02:25:36.824] core.sh:1258: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
I1207 02:25:37.014] deployment.extensions/nginx-deployment-resources resource requirements updated
W1207 02:25:37.114] error: unable to find container named redis
W1207 02:25:37.115] I1207 02:25:37.025987   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources", UID:"615a8076-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1696", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-69c96fd869 to 0
W1207 02:25:37.116] I1207 02:25:37.031666   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-69c96fd869", UID:"615b2e8a-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1700", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-69c96fd869-5pzk2
W1207 02:25:37.116] I1207 02:25:37.031745   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-69c96fd869", UID:"615b2e8a-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1700", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-69c96fd869-clt8h
W1207 02:25:37.116] I1207 02:25:37.034325   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources", UID:"615a8076-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1699", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-5f4579485f to 2
W1207 02:25:37.117] I1207 02:25:37.038115   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-5f4579485f", UID:"61d7d1da-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-5f4579485f-56982
W1207 02:25:37.117] I1207 02:25:37.042605   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149524-28288", Name:"nginx-deployment-resources-5f4579485f", UID:"61d7d1da-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-5f4579485f-nr4xr
... skipping 77 lines ...
I1207 02:25:37.736]     status: "False"
I1207 02:25:37.736]     type: Available
I1207 02:25:37.736]   observedGeneration: 4
I1207 02:25:37.736]   replicas: 6
I1207 02:25:37.736]   unavailableReplicas: 4
I1207 02:25:37.736]   updatedReplicas: 2
W1207 02:25:37.837] error: you must specify resources by --filename when --local is set.
W1207 02:25:37.837] Example resource specifications include:
W1207 02:25:37.838]    '-f rsrc.yaml'
W1207 02:25:37.838]    '--filename=rsrc.json'
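The --local error above is kubectl refusing a client-side operation without a manifest: --local never contacts the server, so the object must come from --filename. A sketch of both forms (the file name here is illustrative):

  kubectl set resources deployment nginx-deployment-resources --local --limits=cpu=200m
  # error: you must specify resources by --filename when --local is set.
  kubectl set resources -f deployment.yaml --local --limits=cpu=200m -o yaml
  # prints the modified object without persisting it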
I1207 02:25:37.938] core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I1207 02:25:38.018] core.sh:1274: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
I1207 02:25:38.121] core.sh:1275: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 44 lines ...
I1207 02:25:39.731]                 pod-template-hash=55c9b846cc
I1207 02:25:39.731] Annotations:    deployment.kubernetes.io/desired-replicas: 1
I1207 02:25:39.732]                 deployment.kubernetes.io/max-replicas: 2
I1207 02:25:39.732]                 deployment.kubernetes.io/revision: 1
I1207 02:25:39.732] Controlled By:  Deployment/test-nginx-apps
I1207 02:25:39.732] Replicas:       1 current / 1 desired
I1207 02:25:39.732] Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:39.732] Pod Template:
I1207 02:25:39.732]   Labels:  app=test-nginx-apps
I1207 02:25:39.733]            pod-template-hash=55c9b846cc
I1207 02:25:39.733]   Containers:
I1207 02:25:39.733]    nginx:
I1207 02:25:39.733]     Image:        k8s.gcr.io/nginx:test-cmd
... skipping 86 lines ...
W1207 02:25:43.850] I1207 02:25:43.185217   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-6f6bb85d9c", UID:"6583e9f8-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1888", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-6f6bb85d9c-cgf6h
W1207 02:25:43.850] Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
W1207 02:25:43.850] I1207 02:25:43.751778   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx", UID:"65832f84-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1901", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-9486b7cb7 to 1
W1207 02:25:43.851] I1207 02:25:43.754978   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-9486b7cb7", UID:"65db6228-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1902", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-9486b7cb7-g54bv
W1207 02:25:43.851] I1207 02:25:43.759188   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx", UID:"65832f84-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1901", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-6f6bb85d9c to 2
W1207 02:25:43.851] I1207 02:25:43.764716   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx", UID:"65832f84-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1905", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-9486b7cb7 to 2
W1207 02:25:43.852] E1207 02:25:43.766547   55610 replica_set.go:450] Sync "namespace-1544149538-14206/nginx-9486b7cb7" failed with Operation cannot be fulfilled on replicasets.apps "nginx-9486b7cb7": the object has been modified; please apply your changes to the latest version and try again
W1207 02:25:43.852] I1207 02:25:43.766879   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-6f6bb85d9c", UID:"6583e9f8-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1908", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-6f6bb85d9c-cgf6h
W1207 02:25:43.853] I1207 02:25:43.771331   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-9486b7cb7", UID:"65db6228-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1911", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-9486b7cb7-6xgg8
I1207 02:25:43.953] apps.sh:293: Successful get deployment.extensions {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I1207 02:25:43.969]     Image:	k8s.gcr.io/nginx:test-cmd
I1207 02:25:44.073] apps.sh:296: Successful get deployment.extensions {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I1207 02:25:44.192] deployment.extensions/nginx rolled back
I1207 02:25:45.295] apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1207 02:25:45.507] apps.sh:303: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
I1207 02:25:45.626] deployment.extensions/nginx rolled back
W1207 02:25:45.726] error: unable to find specified revision 1000000 in history
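The revision error above is the expected failure from asking for a rollback target that was never recorded; only revisions listed by kubectl rollout history can be used:

  kubectl rollout undo deployment/nginx --to-revision=1000000
  # error: unable to find specified revision 1000000 in history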
I1207 02:25:46.732] apps.sh:307: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
I1207 02:25:46.831] deployment.extensions/nginx paused
W1207 02:25:46.951] error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
I1207 02:25:47.052] deployment.extensions/nginx resumed
I1207 02:25:47.173] deployment.extensions/nginx rolled back
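The pause/resume exchange above follows the documented guard: a paused deployment rejects rollbacks until it is resumed. In sketch form, the sequence the test runs is:

  kubectl rollout pause deployment/nginx
  kubectl rollout undo deployment/nginx      # rejected while paused
  kubectl rollout resume deployment/nginx
  kubectl rollout undo deployment/nginx      # now succeeds (the "rolled back" line above)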
I1207 02:25:47.371]     deployment.kubernetes.io/revision-history: 1,3
W1207 02:25:47.566] error: desired revision (3) is different from the running revision (5)
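The "desired revision (3) is different from the running revision (5)" message matches the check kubectl rollout status performs when given an explicit --revision; a plausible reproduction, assuming the deployment is currently running revision 5:

  kubectl rollout status deployment/nginx --revision=3
  # error: desired revision (3) is different from the running revision (5)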
I1207 02:25:47.719] deployment.extensions/nginx2 created
I1207 02:25:47.809] deployment.extensions "nginx2" deleted
I1207 02:25:47.900] deployment.extensions "nginx" deleted
I1207 02:25:48.003] apps.sh:329: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:48.168] deployment.extensions/nginx-deployment created
W1207 02:25:48.269] I1207 02:25:47.722380   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx2", UID:"6838b0e9-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1943", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-6b58f7cc65 to 3
... skipping 22 lines ...
W1207 02:25:49.945] I1207 02:25:48.574951   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"687d27c6-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1991", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-85db47bbdb to 1
W1207 02:25:49.945] I1207 02:25:48.578933   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-85db47bbdb", UID:"68bb5f4b-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1992", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-85db47bbdb-546r8
W1207 02:25:49.945] I1207 02:25:48.581752   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"687d27c6-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1991", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-646d4f779d to 2
W1207 02:25:49.946] I1207 02:25:48.589081   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-646d4f779d", UID:"687deb1d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1996", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-lphj7
W1207 02:25:49.946] I1207 02:25:48.589453   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"687d27c6-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"1994", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-85db47bbdb to 2
W1207 02:25:49.946] I1207 02:25:48.591652   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-85db47bbdb", UID:"68bb5f4b-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2002", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-85db47bbdb-btzpk
W1207 02:25:49.946] error: unable to find container named "redis"
W1207 02:25:49.947] I1207 02:25:49.856161   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"687d27c6-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2024", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-646d4f779d to 0
W1207 02:25:49.947] I1207 02:25:49.862418   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-646d4f779d", UID:"687deb1d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2028", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-tc686
W1207 02:25:49.947] I1207 02:25:49.864018   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"687d27c6-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2026", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-dc756cc6 to 2
W1207 02:25:49.948] I1207 02:25:49.869464   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-646d4f779d", UID:"687deb1d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2028", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-dchjz
W1207 02:25:49.948] I1207 02:25:49.869965   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-dc756cc6", UID:"697dad3d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2034", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-dc756cc6-lkzsf
W1207 02:25:49.948] I1207 02:25:49.870122   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-dc756cc6", UID:"697dad3d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2034", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-dc756cc6-58qpf
... skipping 17 lines ...
I1207 02:25:51.412] deployment.extensions/nginx-deployment env updated
W1207 02:25:51.514] I1207 02:25:51.416671   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"69fd9293-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2080", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5b795689cd to 1
W1207 02:25:51.515] I1207 02:25:51.421742   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-5b795689cd", UID:"6a6cf37f-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2081", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5b795689cd-7d8sr
W1207 02:25:51.515] I1207 02:25:51.425106   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"69fd9293-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2080", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-646d4f779d to 2
W1207 02:25:51.515] I1207 02:25:51.431027   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-646d4f779d", UID:"69fe50fa-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2085", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-646d4f779d-f79mz
W1207 02:25:51.516] I1207 02:25:51.432332   55610 event.go:221] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment", UID:"69fd9293-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2082", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5b795689cd to 2
W1207 02:25:51.516] E1207 02:25:51.433948   55610 replica_set.go:450] Sync "namespace-1544149538-14206/nginx-deployment-5b795689cd" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-5b795689cd": the object has been modified; please apply your changes to the latest version and try again
W1207 02:25:51.517] I1207 02:25:51.438190   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149538-14206", Name:"nginx-deployment-5b795689cd", UID:"6a6cf37f-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2091", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5b795689cd-tvsss
I1207 02:25:51.617] apps.sh:378: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
I1207 02:25:51.626] apps.sh:380: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 1
I1207 02:25:51.737] deployment.extensions/nginx-deployment env updated
I1207 02:25:51.844] apps.sh:384: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
I1207 02:25:51.943] deployment.extensions/nginx-deployment env updated
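The env assertions above (apps.sh:378-384) exercise kubectl set env against the deployment's pod template; a sketch of the kind of calls that produce them (variable values are illustrative):

  kubectl set env deployment/nginx-deployment KEY_2=value2   # container now has 1 env var, named KEY_2
  kubectl set env deployment/nginx-deployment KEY_1=value1   # adds a second var, so len env becomes 2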
... skipping 32 lines ...
I1207 02:25:52.942] Context "test" modified.
I1207 02:25:52.948] +++ [1207 02:25:52] Testing kubectl(v1:replicasets)
I1207 02:25:53.043] apps.sh:502: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:53.198] replicaset.extensions/frontend created
I1207 02:25:53.208] +++ [1207 02:25:53] Deleting rs
I1207 02:25:53.296] replicaset.extensions "frontend" deleted
W1207 02:25:53.397] E1207 02:25:52.770290   55610 replica_set.go:450] Sync "namespace-1544149538-14206/nginx-deployment-65b869c68c" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-65b869c68c": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1544149538-14206/nginx-deployment-65b869c68c, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6acf6c53-f9c7-11e8-b772-0242ac110002, UID in object meta: 
W1207 02:25:53.398] E1207 02:25:52.820121   55610 replica_set.go:450] Sync "namespace-1544149538-14206/nginx-deployment-669d4f8fc9" failed with replicasets.apps "nginx-deployment-669d4f8fc9" not found
W1207 02:25:53.398] E1207 02:25:52.869526   55610 replica_set.go:450] Sync "namespace-1544149538-14206/nginx-deployment-794dcdf6bb" failed with replicasets.apps "nginx-deployment-794dcdf6bb" not found
W1207 02:25:53.398] E1207 02:25:52.919736   55610 replica_set.go:450] Sync "namespace-1544149538-14206/nginx-deployment-7b8f7659b7" failed with replicasets.apps "nginx-deployment-7b8f7659b7" not found
W1207 02:25:53.398] E1207 02:25:52.969654   55610 replica_set.go:450] Sync "namespace-1544149538-14206/nginx-deployment-5b795689cd" failed with replicasets.apps "nginx-deployment-5b795689cd" not found
W1207 02:25:53.399] I1207 02:25:53.204945   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend", UID:"6b7cc371-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2190", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fbnrm
W1207 02:25:53.399] I1207 02:25:53.208715   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend", UID:"6b7cc371-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2190", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zvdfl
W1207 02:25:53.400] I1207 02:25:53.271381   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend", UID:"6b7cc371-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2190", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-l885m
W1207 02:25:53.470] E1207 02:25:53.469789   55610 replica_set.go:450] Sync "namespace-1544149552-16373/frontend" failed with replicasets.apps "frontend" not found
I1207 02:25:53.571] apps.sh:508: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:53.572] apps.sh:512: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:53.655] replicaset.extensions/frontend-no-cascade created
W1207 02:25:53.756] I1207 02:25:53.660047   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend-no-cascade", UID:"6bc2805d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2204", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-6qbxw
W1207 02:25:53.756] I1207 02:25:53.663258   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend-no-cascade", UID:"6bc2805d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2204", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-v6zkn
W1207 02:25:53.757] I1207 02:25:53.671110   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend-no-cascade", UID:"6bc2805d-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2204", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-pjbfg
I1207 02:25:53.857] apps.sh:518: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
I1207 02:25:53.858] +++ [1207 02:25:53] Deleting rs
I1207 02:25:53.858] replicaset.extensions "frontend-no-cascade" deleted
W1207 02:25:53.959] E1207 02:25:53.919381   55610 replica_set.go:450] Sync "namespace-1544149552-16373/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
I1207 02:25:54.059] apps.sh:522: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:25:54.064] apps.sh:524: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
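apps.sh:522-524 verify non-cascading deletion: the ReplicaSet is gone while its three php-redis pods survive and must be removed one by one (the "pod ... deleted" lines that follow). With this vintage of kubectl the cascade flag is a boolean; in sketch form:

  kubectl delete rs frontend-no-cascade --cascade=false
  # deletes only the ReplicaSet; the orphaned pods keep running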
I1207 02:25:54.155] (Bpod "frontend-no-cascade-6qbxw" deleted
I1207 02:25:54.163] pod "frontend-no-cascade-pjbfg" deleted
I1207 02:25:54.170] pod "frontend-no-cascade-v6zkn" deleted
I1207 02:25:54.277] apps.sh:527: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 5 lines ...
I1207 02:25:54.804] Namespace:    namespace-1544149552-16373
I1207 02:25:54.804] Selector:     app=guestbook,tier=frontend
I1207 02:25:54.804] Labels:       app=guestbook
I1207 02:25:54.804]               tier=frontend
I1207 02:25:54.805] Annotations:  <none>
I1207 02:25:54.805] Replicas:     3 current / 3 desired
I1207 02:25:54.805] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:54.805] Pod Template:
I1207 02:25:54.805]   Labels:  app=guestbook
I1207 02:25:54.805]            tier=frontend
I1207 02:25:54.805]   Containers:
I1207 02:25:54.805]    php-redis:
I1207 02:25:54.805]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I1207 02:25:54.931] Namespace:    namespace-1544149552-16373
I1207 02:25:54.931] Selector:     app=guestbook,tier=frontend
I1207 02:25:54.931] Labels:       app=guestbook
I1207 02:25:54.931]               tier=frontend
I1207 02:25:54.931] Annotations:  <none>
I1207 02:25:54.931] Replicas:     3 current / 3 desired
I1207 02:25:54.931] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:54.931] Pod Template:
I1207 02:25:54.931]   Labels:  app=guestbook
I1207 02:25:54.932]            tier=frontend
I1207 02:25:54.932]   Containers:
I1207 02:25:54.932]    php-redis:
I1207 02:25:54.932]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 21 lines ...
I1207 02:25:55.136] Namespace:    namespace-1544149552-16373
I1207 02:25:55.136] Selector:     app=guestbook,tier=frontend
I1207 02:25:55.136] Labels:       app=guestbook
I1207 02:25:55.136]               tier=frontend
I1207 02:25:55.136] Annotations:  <none>
I1207 02:25:55.136] Replicas:     3 current / 3 desired
I1207 02:25:55.137] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:55.137] Pod Template:
I1207 02:25:55.137]   Labels:  app=guestbook
I1207 02:25:55.137]            tier=frontend
I1207 02:25:55.137]   Containers:
I1207 02:25:55.137]    php-redis:
I1207 02:25:55.137]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
I1207 02:25:55.170] Namespace:    namespace-1544149552-16373
I1207 02:25:55.170] Selector:     app=guestbook,tier=frontend
I1207 02:25:55.170] Labels:       app=guestbook
I1207 02:25:55.171]               tier=frontend
I1207 02:25:55.171] Annotations:  <none>
I1207 02:25:55.171] Replicas:     3 current / 3 desired
I1207 02:25:55.171] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:55.171] Pod Template:
I1207 02:25:55.171]   Labels:  app=guestbook
I1207 02:25:55.171]            tier=frontend
I1207 02:25:55.171]   Containers:
I1207 02:25:55.171]    php-redis:
I1207 02:25:55.172]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
I1207 02:25:55.328] Namespace:    namespace-1544149552-16373
I1207 02:25:55.328] Selector:     app=guestbook,tier=frontend
I1207 02:25:55.328] Labels:       app=guestbook
I1207 02:25:55.328]               tier=frontend
I1207 02:25:55.328] Annotations:  <none>
I1207 02:25:55.329] Replicas:     3 current / 3 desired
I1207 02:25:55.329] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:55.329] Pod Template:
I1207 02:25:55.329]   Labels:  app=guestbook
I1207 02:25:55.329]            tier=frontend
I1207 02:25:55.329]   Containers:
I1207 02:25:55.329]    php-redis:
I1207 02:25:55.329]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I1207 02:25:55.446] Namespace:    namespace-1544149552-16373
I1207 02:25:55.446] Selector:     app=guestbook,tier=frontend
I1207 02:25:55.446] Labels:       app=guestbook
I1207 02:25:55.447]               tier=frontend
I1207 02:25:55.447] Annotations:  <none>
I1207 02:25:55.447] Replicas:     3 current / 3 desired
I1207 02:25:55.447] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:55.447] Pod Template:
I1207 02:25:55.447]   Labels:  app=guestbook
I1207 02:25:55.447]            tier=frontend
I1207 02:25:55.447]   Containers:
I1207 02:25:55.447]    php-redis:
I1207 02:25:55.447]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
I1207 02:25:55.562] Namespace:    namespace-1544149552-16373
I1207 02:25:55.562] Selector:     app=guestbook,tier=frontend
I1207 02:25:55.563] Labels:       app=guestbook
I1207 02:25:55.563]               tier=frontend
I1207 02:25:55.563] Annotations:  <none>
I1207 02:25:55.563] Replicas:     3 current / 3 desired
I1207 02:25:55.563] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:55.563] Pod Template:
I1207 02:25:55.563]   Labels:  app=guestbook
I1207 02:25:55.563]            tier=frontend
I1207 02:25:55.563]   Containers:
I1207 02:25:55.563]    php-redis:
I1207 02:25:55.563]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
I1207 02:25:55.684] Namespace:    namespace-1544149552-16373
I1207 02:25:55.685] Selector:     app=guestbook,tier=frontend
I1207 02:25:55.685] Labels:       app=guestbook
I1207 02:25:55.685]               tier=frontend
I1207 02:25:55.685] Annotations:  <none>
I1207 02:25:55.685] Replicas:     3 current / 3 desired
I1207 02:25:55.685] Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
I1207 02:25:55.685] Pod Template:
I1207 02:25:55.685]   Labels:  app=guestbook
I1207 02:25:55.686]            tier=frontend
I1207 02:25:55.686]   Containers:
I1207 02:25:55.686]    php-redis:
I1207 02:25:55.686]     Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 184 lines ...
W1207 02:26:01.317] I1207 02:26:00.836256   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend", UID:"7009aa2c-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2415", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jd952
W1207 02:26:01.318] I1207 02:26:00.839433   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend", UID:"7009aa2c-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2415", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9wfnm
W1207 02:26:01.318] I1207 02:26:00.839575   55610 event.go:221] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1544149552-16373", Name:"frontend", UID:"7009aa2c-f9c7-11e8-b772-0242ac110002", APIVersion:"apps/v1", ResourceVersion:"2415", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qvpjf
I1207 02:26:01.419] horizontalpodautoscaler.autoscaling/frontend autoscaled
I1207 02:26:01.426] apps.sh:647: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
I1207 02:26:01.516] (Bhorizontalpodautoscaler.autoscaling "frontend" deleted
W1207 02:26:01.617] Error: required flag(s) "max" not set
W1207 02:26:01.617] 
W1207 02:26:01.617] 
W1207 02:26:01.617] Examples:
W1207 02:26:01.617]   # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
W1207 02:26:01.618]   kubectl autoscale deployment foo --min=2 --max=10
W1207 02:26:01.618]   
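The autoscale failure above is flag validation: --max is mandatory. Supplying it yields the HPA asserted at apps.sh:647 (min 2, max 3, target CPU 80%):

  kubectl autoscale deployment frontend --min=2 --cpu-percent=80
  # Error: required flag(s) "max" not set
  kubectl autoscale deployment frontend --min=2 --max=3 --cpu-percent=80
  # horizontalpodautoscaler.autoscaling/frontend autoscaled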
... skipping 88 lines ...
I1207 02:26:04.845] apps.sh:431: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I1207 02:26:04.949] apps.sh:432: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I1207 02:26:05.067] statefulset.apps/nginx rolled back
I1207 02:26:05.171] apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I1207 02:26:05.274] apps.sh:436: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1207 02:26:05.391] Successful
I1207 02:26:05.391] message:error: unable to find specified revision 1000000 in history
I1207 02:26:05.391] has:unable to find specified revision
I1207 02:26:05.491] apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
I1207 02:26:05.590] apps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
I1207 02:26:05.714] statefulset.apps/nginx rolled back
I1207 02:26:05.819] apps.sh:444: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
I1207 02:26:05.922] apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 58 lines ...
I1207 02:26:07.897] Name:         mock
I1207 02:26:07.897] Namespace:    namespace-1544149566-19747
I1207 02:26:07.897] Selector:     app=mock
I1207 02:26:07.897] Labels:       app=mock
I1207 02:26:07.898] Annotations:  <none>
I1207 02:26:07.898] Replicas:     1 current / 1 desired
I1207 02:26:07.898] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1207 02:26:07.898] Pod Template:
I1207 02:26:07.898]   Labels:  app=mock
I1207 02:26:07.898]   Containers:
I1207 02:26:07.898]    mock-container:
I1207 02:26:07.898]     Image:        k8s.gcr.io/pause:2.0
I1207 02:26:07.898]     Port:         9949/TCP
... skipping 56 lines ...
I1207 02:26:10.304] Name:         mock
I1207 02:26:10.304] Namespace:    namespace-1544149566-19747
I1207 02:26:10.304] Selector:     app=mock
I1207 02:26:10.304] Labels:       app=mock
I1207 02:26:10.305] Annotations:  <none>
I1207 02:26:10.305] Replicas:     1 current / 1 desired
I1207 02:26:10.305] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1207 02:26:10.305] Pod Template:
I1207 02:26:10.305]   Labels:  app=mock
I1207 02:26:10.305]   Containers:
I1207 02:26:10.305]    mock-container:
I1207 02:26:10.305]     Image:        k8s.gcr.io/pause:2.0
I1207 02:26:10.305]     Port:         9949/TCP
... skipping 56 lines ...
I1207 02:26:12.663] Name:         mock
I1207 02:26:12.663] Namespace:    namespace-1544149566-19747
I1207 02:26:12.663] Selector:     app=mock
I1207 02:26:12.663] Labels:       app=mock
I1207 02:26:12.664] Annotations:  <none>
I1207 02:26:12.664] Replicas:     1 current / 1 desired
I1207 02:26:12.664] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1207 02:26:12.664] Pod Template:
I1207 02:26:12.664]   Labels:  app=mock
I1207 02:26:12.664]   Containers:
I1207 02:26:12.664]    mock-container:
I1207 02:26:12.664]     Image:        k8s.gcr.io/pause:2.0
I1207 02:26:12.664]     Port:         9949/TCP
... skipping 42 lines ...
I1207 02:26:14.945] Namespace:    namespace-1544149566-19747
I1207 02:26:14.945] Selector:     app=mock
I1207 02:26:14.945] Labels:       app=mock
I1207 02:26:14.945]               status=replaced
I1207 02:26:14.945] Annotations:  <none>
I1207 02:26:14.945] Replicas:     1 current / 1 desired
I1207 02:26:14.946] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1207 02:26:14.946] Pod Template:
I1207 02:26:14.946]   Labels:  app=mock
I1207 02:26:14.946]   Containers:
I1207 02:26:14.946]    mock-container:
I1207 02:26:14.946]     Image:        k8s.gcr.io/pause:2.0
I1207 02:26:14.946]     Port:         9949/TCP
... skipping 11 lines ...
I1207 02:26:14.956] Namespace:    namespace-1544149566-19747
I1207 02:26:14.956] Selector:     app=mock2
I1207 02:26:14.957] Labels:       app=mock2
I1207 02:26:14.957]               status=replaced
I1207 02:26:14.957] Annotations:  <none>
I1207 02:26:14.957] Replicas:     1 current / 1 desired
I1207 02:26:14.957] Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
I1207 02:26:14.957] Pod Template:
I1207 02:26:14.957]   Labels:  app=mock2
I1207 02:26:14.957]   Containers:
I1207 02:26:14.958]    mock-container:
I1207 02:26:14.958]     Image:        k8s.gcr.io/pause:2.0
I1207 02:26:14.958]     Port:         9949/TCP
... skipping 105 lines ...
I1207 02:26:20.001] +++ [1207 02:26:19] Creating namespace namespace-1544149579-9953
I1207 02:26:20.078] namespace/namespace-1544149579-9953 created
I1207 02:26:20.152] Context "test" modified.
I1207 02:26:20.159] +++ [1207 02:26:20] Testing persistent volumes
I1207 02:26:20.259] storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
I1207 02:26:20.413] (Bpersistentvolume/pv0001 created
W1207 02:26:20.514] E1207 02:26:20.426090   55610 pv_protection_controller.go:116] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
I1207 02:26:20.614] storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
I1207 02:26:20.615] persistentvolume "pv0001" deleted
I1207 02:26:20.774] persistentvolume/pv0002 created
I1207 02:26:20.881] storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
I1207 02:26:20.966] persistentvolume "pv0002" deleted
I1207 02:26:21.134] persistentvolume/pv0003 created
... skipping 480 lines ...
I1207 02:26:26.361] yes
I1207 02:26:26.361] has:the server doesn't have a resource type
I1207 02:26:26.441] Successful
I1207 02:26:26.441] message:yes
I1207 02:26:26.442] has:yes
I1207 02:26:26.522] Successful
I1207 02:26:26.522] message:error: --subresource can not be used with NonResourceURL
I1207 02:26:26.522] has:subresource can not be used with NonResourceURL
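kubectl auth can-i accepts either a resource (optionally with --subresource) or a non-resource URL path, but not a path combined with --subresource, which is the rejection asserted above:

  kubectl auth can-i get /logs --subresource=log
  # error: --subresource can not be used with NonResourceURL
  kubectl auth can-i get pods --subresource=log
  # valid: checks access to the pods/log subresource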
I1207 02:26:26.605] Successful
I1207 02:26:26.688] Successful
I1207 02:26:26.688] message:yes
I1207 02:26:26.689] 0
I1207 02:26:26.689] has:0
... skipping 6 lines ...
I1207 02:26:26.894] role.rbac.authorization.k8s.io/testing-R reconciled
I1207 02:26:26.998] legacy-script.sh:736: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
I1207 02:26:27.102] legacy-script.sh:737: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
I1207 02:26:27.205] legacy-script.sh:738: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
I1207 02:26:27.305] legacy-script.sh:739: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
I1207 02:26:27.394] Successful
I1207 02:26:27.394] message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
I1207 02:26:27.394] has:only rbac.authorization.k8s.io/v1 is supported
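kubectl auth reconcile only accepts rbac.authorization.k8s.io/v1 objects; feeding it a v1beta1 ClusterRole is the failure asserted above. In sketch form (the manifest name is illustrative):

  kubectl auth reconcile -f clusterrole-v1beta1.yaml
  # error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole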
I1207 02:26:27.496] rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
I1207 02:26:27.504] role.rbac.authorization.k8s.io "testing-R" deleted
I1207 02:26:27.517] clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
I1207 02:26:27.529] clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
I1207 02:26:27.541] Recording: run_retrieve_multiple_tests
... skipping 893 lines ...
I1207 02:26:55.929] message:node/127.0.0.1 already uncordoned (dry run)
I1207 02:26:55.929] has:already uncordoned
I1207 02:26:56.027] node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
I1207 02:26:56.113] node/127.0.0.1 labeled
I1207 02:26:56.215] node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
I1207 02:26:56.287] Successful
I1207 02:26:56.287] message:error: cannot specify both a node name and a --selector option
I1207 02:26:56.287] See 'kubectl drain -h' for help and examples
I1207 02:26:56.287] has:cannot specify both a node name
I1207 02:26:56.360] Successful
I1207 02:26:56.361] message:error: USAGE: cordon NODE [flags]
I1207 02:26:56.361] See 'kubectl cordon -h' for help and examples
I1207 02:26:56.361] has:error\: USAGE\: cordon NODE
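Both failures above are argument validation in the node-management commands: drain takes a node name or a --selector but not both, and cordon requires a node argument:

  kubectl drain 127.0.0.1 --selector=env=test
  # error: cannot specify both a node name and a --selector option
  kubectl cordon
  # error: USAGE: cordon NODE [flags]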
I1207 02:26:56.442] node/127.0.0.1 already uncordoned
I1207 02:26:56.519] Successful
I1207 02:26:56.520] message:error: You must provide one or more resources by argument or filename.
I1207 02:26:56.520] Example resource specifications include:
I1207 02:26:56.520]    '-f rsrc.yaml'
I1207 02:26:56.520]    '--filename=rsrc.json'
I1207 02:26:56.520]    '<resource> <name>'
I1207 02:26:56.520]    '<resource>'
I1207 02:26:56.521] has:must provide one or more resources
... skipping 15 lines ...
I1207 02:26:56.990] Successful
I1207 02:26:56.991] message:The following kubectl-compatible plugins are available:
I1207 02:26:56.991] 
I1207 02:26:56.991] test/fixtures/pkg/kubectl/plugins/version/kubectl-version
I1207 02:26:56.991]   - warning: kubectl-version overwrites existing command: "kubectl version"
I1207 02:26:56.991] 
I1207 02:26:56.991] error: one plugin warning was found
I1207 02:26:56.991] has:kubectl-version overwrites existing command: "kubectl version"
I1207 02:26:57.067] Successful
I1207 02:26:57.067] message:The following kubectl-compatible plugins are available:
I1207 02:26:57.067] 
I1207 02:26:57.068] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I1207 02:26:57.068] test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
I1207 02:26:57.068]   - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
I1207 02:26:57.068] 
I1207 02:26:57.068] error: one plugin warning was found
I1207 02:26:57.068] has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
I1207 02:26:57.145] Successful
I1207 02:26:57.145] message:The following kubectl-compatible plugins are available:
I1207 02:26:57.146] 
I1207 02:26:57.146] test/fixtures/pkg/kubectl/plugins/kubectl-foo
I1207 02:26:57.146] has:plugins are available
I1207 02:26:57.225] Successful
I1207 02:26:57.225] message:
I1207 02:26:57.226] error: unable to read directory "test/fixtures/pkg/kubectl/plugins/empty" in your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory
I1207 02:26:57.226] error: unable to find any kubectl plugins in your PATH
I1207 02:26:57.226] has:unable to find any kubectl plugins in your PATH
I1207 02:26:57.302] Successful
I1207 02:26:57.302] message:I am plugin foo
I1207 02:26:57.303] has:plugin foo
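The plugin checks above exercise kubectl's discovery rules: any executable named kubectl-<name> on PATH is a plugin, same-named files later in PATH are shadowed, and a plugin colliding with a built-in (kubectl-version) triggers a warning. A minimal plugin matching the "I am plugin foo" output (the install path is illustrative):

  cat > /usr/local/bin/kubectl-foo <<'EOF'
  #!/bin/sh
  echo "I am plugin foo"
  EOF
  chmod +x /usr/local/bin/kubectl-foo
  kubectl foo          # dispatches to kubectl-foo
  kubectl plugin list  # enumerates plugins and prints warnings like those above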
I1207 02:26:57.377] Successful
I1207 02:26:57.377] message:Client Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.0-alpha.0.895+72d903cf0d5c7d", GitCommit:"72d903cf0d5c7db013174728826fe240ae727556", GitTreeState:"clean", BuildDate:"2018-12-07T02:19:56Z", GoVersion:"go1.11.1", Compiler:"gc", Platform:"linux/amd64"}
... skipping 9 lines ...
I1207 02:26:57.464] 
I1207 02:26:57.466] +++ Running case: test-cmd.run_impersonation_tests 
I1207 02:26:57.469] +++ working dir: /go/src/k8s.io/kubernetes
I1207 02:26:57.472] +++ command: run_impersonation_tests
I1207 02:26:57.483] +++ [1207 02:26:57] Testing impersonation
I1207 02:26:57.557] Successful
I1207 02:26:57.557] message:error: requesting groups or user-extra for  without impersonating a user
I1207 02:26:57.557] has:without impersonating a user
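The impersonation failure above is the guard that group impersonation requires a user: --as-group is rejected unless --as is also set:

  kubectl get pods --as-group=system:masters
  # error: requesting groups or user-extra for  without impersonating a user
  kubectl get pods --as=user1 --as-group=system:masters
  # valid: impersonates user1 with the extra group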
I1207 02:26:57.720] certificatesigningrequest.certificates.k8s.io/foo created
I1207 02:26:57.821] authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
I1207 02:26:57.916] authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
I1207 02:26:58.000] (Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
I1207 02:26:58.172] certificatesigningrequest.certificates.k8s.io/foo created
... skipping 113 lines ...
W1207 02:26:58.756] I1207 02:26:58.735832   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.756] I1207 02:26:58.735929   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.756] I1207 02:26:58.735882   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.757] I1207 02:26:58.736072   52262 picker_wrapper.go:218] blockingPicker: the picked transport is not ready, loop back to repick
W1207 02:26:58.757] I1207 02:26:58.736089   52262 picker_wrapper.go:218] blockingPicker: the picked transport is not ready, loop back to repick
W1207 02:26:58.757] I1207 02:26:58.736104   52262 picker_wrapper.go:218] blockingPicker: the picked transport is not ready, loop back to repick
W1207 02:26:58.757] W1207 02:26:58.737292   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.758] W1207 02:26:58.737575   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.758] W1207 02:26:58.736291   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.758] W1207 02:26:58.736336   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.758] W1207 02:26:58.736371   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.759] I1207 02:26:58.737636   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.759] W1207 02:26:58.736386   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.759] W1207 02:26:58.736409   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.759] W1207 02:26:58.736456   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.760] W1207 02:26:58.736500   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.760] W1207 02:26:58.736554   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.760] W1207 02:26:58.736557   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.760] W1207 02:26:58.736633   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.761] W1207 02:26:58.736673   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.761] W1207 02:26:58.736729   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.761] W1207 02:26:58.736758   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.762] W1207 02:26:58.736767   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.762] I1207 02:26:58.736781   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.762] W1207 02:26:58.736804   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.762] W1207 02:26:58.736805   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.763] I1207 02:26:58.736811   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.763] I1207 02:26:58.736818   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.763] I1207 02:26:58.736839   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.763] W1207 02:26:58.736842   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.764] I1207 02:26:58.736855   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.764] I1207 02:26:58.736873   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.764] W1207 02:26:58.736875   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.764] W1207 02:26:58.736916   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.765] W1207 02:26:58.736919   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.765] W1207 02:26:58.736936   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.765] I1207 02:26:58.736943   52262 balancer_v1_wrapper.go:125] balancerWrapper: got update addr from Notify: [{127.0.0.1:2379 <nil>}]
W1207 02:26:58.765] W1207 02:26:58.736944   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.765] W1207 02:26:58.736969   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.766] W1207 02:26:58.736977   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.766] W1207 02:26:58.736981   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.766] W1207 02:26:58.737008   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.766] W1207 02:26:58.737008   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.767] W1207 02:26:58.737017   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.767] W1207 02:26:58.737046   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.767] W1207 02:26:58.737049   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.767] W1207 02:26:58.737045   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.768] W1207 02:26:58.737048   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.768] W1207 02:26:58.737070   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.768] W1207 02:26:58.737080   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.769] W1207 02:26:58.737094   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.769] W1207 02:26:58.737097   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.769] W1207 02:26:58.737104   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.769] W1207 02:26:58.737111   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.770] W1207 02:26:58.737161   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.770] W1207 02:26:58.737169   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1207 02:26:58.770] W1207 02:26:58.737165   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 44 lines ...
... skipping 11 lines ...
W1207 02:26:58.782] + make test-integration
I1207 02:26:58.883] No resources found
I1207 02:26:58.883] pod "test-pod-1" force deleted
I1207 02:26:58.883] +++ [1207 02:26:58] TESTS PASSED
I1207 02:26:58.883] junit report dir: /workspace/artifacts
I1207 02:26:58.883] +++ [1207 02:26:58] Clean up complete
W1207 02:26:59.734] W1207 02:26:59.733402   52262 clientconn.go:1304] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 33 lines ...
W1207 02:26:59.743] E1207 02:26:59.735105   52262 controller.go:172] Get https://127.0.0.1:6443/api/v1/namespaces/default/endpoints/kubernetes: dial tcp 127.0.0.1:6443: connect: connection refused
... skipping 2 lines ...
I1207 02:27:03.484] +++ [1207 02:27:03] Checking etcd is on PATH
I1207 02:27:03.485] /workspace/kubernetes/third_party/etcd/etcd
I1207 02:27:03.489] +++ [1207 02:27:03] Starting etcd instance
I1207 02:27:03.545] etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.8I5HTogGLU --listen-client-urls http://127.0.0.1:2379 --debug > "/workspace/artifacts/etcd.ccdffca7b1ae.root.log.DEBUG.20181207-022703.93879" 2>/dev/null
I1207 02:27:03.546] Waiting for etcd to come up.
I1207 02:27:03.988] +++ [1207 02:27:03] On try 2, etcd: : http://127.0.0.1:2379
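A minimal local-reproduction sketch of the harness steps above (assuming etcd is on PATH and the kubernetes repo is under GOPATH; the temp data dir below is illustrative, not taken from this job):
    # start a local etcd the same way the harness does
    etcd --advertise-client-urls http://127.0.0.1:2379 \
         --listen-client-urls http://127.0.0.1:2379 \
         --data-dir "$(mktemp -d)" &
    # once etcd answers on 127.0.0.1:2379, rerun the failing package (see FAIL line below)
    go test -v k8s.io/kubernetes/test/integration/auth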
... skipping 4 lines ...
I1207 02:30:42.550] ok  	k8s.io/kubernetes/test/integration/apimachinery	168.884s
I1207 02:30:42.551] ok  	k8s.io/kubernetes/test/integration/apiserver	37.702s
I1207 02:30:42.552] [restful] 2018/12/07 02:29:38 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:36361/swaggerapi
I1207 02:30:42.552] [restful] 2018/12/07 02:29:38 log.go:33: [restful/swagger] https://127.0.0.1:36361/swaggerui/ is mapped to folder /swagger-ui/
I1207 02:30:42.552] [restful] 2018/12/07 02:29:40 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:36361/swaggerapi
I1207 02:30:42.553] [restful] 2018/12/07 02:29:40 log.go:33: [restful/swagger] https://127.0.0.1:36361/swaggerui/ is mapped to folder /swagger-ui/
I1207 02:30:42.553] FAIL	k8s.io/kubernetes/test/integration/auth	93.879s
I1207 02:30:42.553] [restful] 2018/12/07 02:28:32 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:33559/swaggerapi
I1207 02:30:42.553] [restful] 2018/12/07 02:28:32 log.go:33: [restful/swagger] https://127.0.0.1:33559/swaggerui/ is mapped to folder /swagger-ui/
I1207 02:30:42.554] [restful] 2018/12/07 02:28:34 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:33559/swaggerapi
I1207 02:30:42.554] [restful] 2018/12/07 02:28:34 log.go:33: [restful/swagger] https://127.0.0.1:33559/swaggerui/ is mapped to folder /swagger-ui/
I1207 02:30:42.554] [restful] 2018/12/07 02:28:42 log.go:33: [restful/swagger] listing is available at https://127.0.0.1:44251/swaggerapi
I1207 02:30:42.554] [restful] 2018/12/07 02:28:42 log.go:33: [restful/swagger] https://127.0.0.1:44251/swaggerui/ is mapped to folder /swagger-ui/
... skipping 224 lines ...
I1207 02:39:33.144] [restful] 2018/12/07 02:32:57 log.go:33: [restful/swagger] https://127.0.0.1:38983/swaggerui/ is mapped to folder /swagger-ui/
I1207 02:39:33.144] ok  	k8s.io/kubernetes/test/integration/tls	13.008s
I1207 02:39:33.145] ok  	k8s.io/kubernetes/test/integration/ttlcontroller	11.248s
I1207 02:39:33.145] ok  	k8s.io/kubernetes/test/integration/volume	95.482s
I1207 02:39:33.145] ok  	k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration	144.279s
I1207 02:39:34.626] +++ [1207 02:39:34] Saved JUnit XML test report to /workspace/artifacts/junit_f5a444384056ebac4f2929ce7b7920ea9733ca19_20181207-022709.xml
I1207 02:39:34.629] Makefile:184: recipe for target 'test' failed
I1207 02:39:34.641] +++ [1207 02:39:34] Cleaning up etcd
W1207 02:39:34.742] make[1]: *** [test] Error 1
W1207 02:39:34.742] !!! [1207 02:39:34] Call tree:
W1207 02:39:34.742] !!! [1207 02:39:34]  1: hack/make-rules/test-integration.sh:105 runTests(...)
W1207 02:39:34.841] make: *** [test-integration] Error 1
I1207 02:39:34.942] +++ [1207 02:39:34] Integration test cleanup complete
I1207 02:39:34.942] Makefile:203: recipe for target 'test-integration' failed
W1207 02:39:36.062] Traceback (most recent call last):
W1207 02:39:36.062]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 167, in <module>
W1207 02:39:36.062]     main(ARGS.branch, ARGS.script, ARGS.force, ARGS.prow)
W1207 02:39:36.063]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 136, in main
W1207 02:39:36.063]     check(*cmd)
W1207 02:39:36.063]   File "/workspace/./test-infra/jenkins/../scenarios/kubernetes_verify.py", line 48, in check
W1207 02:39:36.063]     subprocess.check_call(cmd)
W1207 02:39:36.063]   File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
W1207 02:39:36.090]     raise CalledProcessError(retcode, cmd)
W1207 02:39:36.091] subprocess.CalledProcessError: Command '('docker', 'run', '--rm=true', '--privileged=true', '-v', '/var/run/docker.sock:/var/run/docker.sock', '-v', '/etc/localtime:/etc/localtime:ro', '-v', '/workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes', '-v', '/workspace/k8s.io/:/workspace/k8s.io/', '-v', '/workspace/_artifacts:/workspace/artifacts', '-e', 'KUBE_FORCE_VERIFY_CHECKS=n', '-e', 'KUBE_VERIFY_GIT_BRANCH=master', '-e', 'REPO_DIR=/workspace/k8s.io/kubernetes', '--tmpfs', '/tmp:exec,mode=1777', 'gcr.io/k8s-testimages/kubekins-test:1.13-v20181105-ceed87206', 'bash', '-c', 'cd kubernetes && ./hack/jenkins/test-dockerized.sh')' returned non-zero exit status 2
E1207 02:39:36.099] Command failed
I1207 02:39:36.099] process 698 exited with code 1 after 25.7m
E1207 02:39:36.100] FAIL: pull-kubernetes-integration
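For readability, the command that failed above, reconstructed verbatim from the argument tuple in the traceback (line breaks added, no arguments changed):
    docker run --rm=true --privileged=true \
      -v /var/run/docker.sock:/var/run/docker.sock \
      -v /etc/localtime:/etc/localtime:ro \
      -v /workspace/k8s.io/kubernetes:/go/src/k8s.io/kubernetes \
      -v /workspace/k8s.io/:/workspace/k8s.io/ \
      -v /workspace/_artifacts:/workspace/artifacts \
      -e KUBE_FORCE_VERIFY_CHECKS=n \
      -e KUBE_VERIFY_GIT_BRANCH=master \
      -e REPO_DIR=/workspace/k8s.io/kubernetes \
      --tmpfs /tmp:exec,mode=1777 \
      gcr.io/k8s-testimages/kubekins-test:1.13-v20181105-ceed87206 \
      bash -c 'cd kubernetes && ./hack/jenkins/test-dockerized.sh'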
I1207 02:39:36.100] Call:  gcloud auth activate-service-account --key-file=/etc/service-account/service-account.json
W1207 02:39:36.613] Activated service account credentials for: [pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com]
I1207 02:39:36.665] process 123786 exited with code 0 after 0.0m
I1207 02:39:36.666] Call:  gcloud config get-value account
I1207 02:39:36.966] process 123799 exited with code 0 after 0.0m
I1207 02:39:36.967] Will upload results to gs://kubernetes-jenkins/pr-logs using pr-kubekins@kubernetes-jenkins-pull.iam.gserviceaccount.com
I1207 02:39:36.967] Upload result and artifacts...
I1207 02:39:36.967] Gubernator results at https://gubernator.k8s.io/build/kubernetes-jenkins/pr-logs/pull/71764/pull-kubernetes-integration/37814
I1207 02:39:36.968] Call:  gsutil ls gs://kubernetes-jenkins/pr-logs/pull/71764/pull-kubernetes-integration/37814/artifacts
W1207 02:39:38.924] CommandException: One or more URLs matched no objects.
E1207 02:39:39.153] Command failed
I1207 02:39:39.153] process 123812 exited with code 1 after 0.0m
W1207 02:39:39.153] Remote dir gs://kubernetes-jenkins/pr-logs/pull/71764/pull-kubernetes-integration/37814/artifacts not exist yet
I1207 02:39:39.153] Call:  gsutil -m -q -o GSUtil:use_magicfile=True cp -r -c -z log,txt,xml /workspace/_artifacts gs://kubernetes-jenkins/pr-logs/pull/71764/pull-kubernetes-integration/37814/artifacts
I1207 02:39:43.169] process 123957 exited with code 0 after 0.1m
W1207 02:39:43.170] metadata path /workspace/_artifacts/metadata.json does not exist
W1207 02:39:43.170] metadata not found or invalid, init with empty metadata
... skipping 23 lines ...