PR abserari:fix-todo-in-pkg/util/taints
Result FAILURE
Tests 1 failed / 2968 succeeded
Started 2020-10-17 10:32
Elapsed 28m53s
Revision 6ad5b5120db6daf1b929fd8cdc0febcb23822dcc
Refs 95656

Test Failures


k8s.io/kubernetes/test/integration/apiserver/apply TestApplyCRDNonStructuralSchema 11s

go test -v k8s.io/kubernetes/test/integration/apiserver/apply -run TestApplyCRDNonStructuralSchema$
=== RUN   TestApplyCRDNonStructuralSchema
E1017 10:49:59.827160  116821 controller.go:184] Get "https://127.0.0.1:41061/api/v1/namespaces/default/endpoints/kubernetes": dial tcp 127.0.0.1:41061: connect: connection refused
    testserver.go:312: Resolved testserver package path to: "/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kube-apiserver/app/testing"
I1017 10:50:01.271263  116821 serving.go:325] Generated self-signed cert (/tmp/kubernetes-kube-apiserver405427394/apiserver.crt, /tmp/kubernetes-kube-apiserver405427394/apiserver.key)
I1017 10:50:01.271375  116821 server.go:625] external host was not specified, using 127.0.0.1
W1017 10:50:01.271422  116821 authentication.go:504] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
    testserver.go:183: runtime-config=map[api/all:true]
    testserver.go:184: Starting kube-apiserver on port 40921...
W1017 10:50:02.242129  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.242570  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.242608  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.243511  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.244717  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.244764  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.244807  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.244833  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.244878  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.244923  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.245203  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.245448  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:50:02.245529  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:50:02.245550  116821 plugins.go:158] Loaded 11 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I1017 10:50:02.245563  116821 plugins.go:161] Loaded 10 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
I1017 10:50:02.247327  116821 plugins.go:158] Loaded 11 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I1017 10:50:02.247358  116821 plugins.go:161] Loaded 10 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
I1017 10:50:02.255903  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.255949  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.256898  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.256929  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:50:02.296983  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:50:02.298440  116821 instance.go:271] Using reconciler: lease
I1017 10:50:02.298781  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.298814  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.303507  116821 instance.go:376] Could not construct pre-rendered responses for ServiceAccountIssuerDiscovery endpoints. Endpoints will not be enabled. Error: empty issuer URL
I1017 10:50:02.303943  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.303979  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.305215  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.305251  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.307156  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.307191  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.308670  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.308701  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.309569  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.309596  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.310685  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.310727  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.311767  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.311806  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.312994  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.313035  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.319362  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.319402  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.321907  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.321934  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.323310  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.323345  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.324167  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.324205  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.325890  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.325918  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.327242  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.327268  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.328655  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.328680  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.329754  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.329791  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.331093  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.331126  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.653638  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.653692  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.658352  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.658498  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.660058  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.660094  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.661893  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.661929  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.663073  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.663105  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.666471  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.666498  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.667800  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.667829  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.668916  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.668943  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.670637  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.670668  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.672359  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.672379  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.674003  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.674138  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.674973  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.675161  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.676081  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.676119  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.677438  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.677460  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.678831  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.678853  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.680208  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.680230  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.681954  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.681980  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.683476  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.683499  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.685349  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.685375  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.686335  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.686357  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.707807  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.707840  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.720431  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.720470  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.742628  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.742673  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.748263  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.748295  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.752691  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.752737  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.756056  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.756122  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.757245  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.757278  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.758415  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.758451  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.760995  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.761030  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.762449  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.762544  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.763373  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.763404  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.764597  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.764627  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.765711  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.765750  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.767474  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.767497  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.770035  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.770072  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.771652  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.771689  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.773815  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.773839  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.781141  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.781175  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.782256  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.782283  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.784961  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.784994  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.787826  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.787861  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.790028  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.790062  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.791143  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.791165  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.792231  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.792263  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.793980  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.794025  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.794862  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.794890  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.796327  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.796360  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.797394  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.797421  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.798465  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.798504  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.799699  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.799732  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.802120  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.802152  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.804825  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.804853  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.806111  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.806140  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.807390  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.807413  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.808836  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.808873  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.811029  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.811100  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.812038  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.812084  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.813032  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.813062  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.813953  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.813984  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:50:03.139763  116821 genericapiserver.go:412] Skipping API discovery.k8s.io/v1alpha1 because it has no resources.
I1017 10:50:03.240962  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:03.241043  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:50:03.422702  116821 genericapiserver.go:412] Skipping API apps/v1beta2 because it has no resources.
W1017 10:50:03.422748  116821 genericapiserver.go:412] Skipping API apps/v1beta1 because it has no resources.
I1017 10:50:03.447990  116821 plugins.go:158] Loaded 11 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I1017 10:50:03.448023  116821 plugins.go:161] Loaded 10 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
W1017 10:50:03.453261  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:50:03.453541  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:03.453584  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:03.454471  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:03.454511  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:50:03.458246  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
    testserver.go:200: Waiting for /healthz to be ok...
I1017 10:50:08.392453  116821 dynamic_cafile_content.go:167] Starting request-header::/tmp/kubernetes-kube-apiserver405427394/proxy-ca.crt
I1017 10:50:08.392484  116821 dynamic_cafile_content.go:167] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver405427394/client-ca.crt
I1017 10:50:08.393106  116821 dynamic_serving_content.go:130] Starting serving-cert::/tmp/kubernetes-kube-apiserver405427394/apiserver.crt::/tmp/kubernetes-kube-apiserver405427394/apiserver.key
I1017 10:50:08.393887  116821 secure_serving.go:197] Serving securely on 127.0.0.1:40921
I1017 10:50:08.393952  116821 tlsconfig.go:240] Starting DynamicServingCertificateController
W1017 10:50:08.395461  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:50:08.395693  116821 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I1017 10:50:08.395712  116821 shared_informer.go:240] Waiting for caches to sync for cluster_authentication_trust_controller
I1017 10:50:08.395773  116821 controller.go:83] Starting OpenAPI AggregationController
I1017 10:50:08.395830  116821 customresource_discovery_controller.go:209] Starting DiscoveryController
I1017 10:50:08.396561  116821 apiservice_controller.go:97] Starting APIServiceRegistrationController
I1017 10:50:08.396592  116821 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I1017 10:50:08.396623  116821 available_controller.go:457] Starting AvailableConditionController
I1017 10:50:08.396629  116821 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I1017 10:50:08.396728  116821 autoregister_controller.go:141] Starting autoregister controller
I1017 10:50:08.396735  116821 cache.go:32] Waiting for caches to sync for autoregister controller
I1017 10:50:08.402386  116821 dynamic_cafile_content.go:167] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver405427394/client-ca.crt
I1017 10:50:08.402745  116821 dynamic_cafile_content.go:167] Starting request-header::/tmp/kubernetes-kube-apiserver405427394/proxy-ca.crt
I1017 10:50:08.403277  116821 controller.go:86] Starting OpenAPI controller
I1017 10:50:08.403467  116821 naming_controller.go:291] Starting NamingConditionController
I1017 10:50:08.403562  116821 establishing_controller.go:76] Starting EstablishingController
I1017 10:50:08.403664  116821 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
I1017 10:50:08.403771  116821 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
I1017 10:50:08.403852  116821 crd_finalizer.go:266] Starting CRDFinalizer
I1017 10:50:08.403988  116821 crdregistration_controller.go:111] Starting crd-autoregister controller
I1017 10:50:08.404059  116821 shared_informer.go:240] Waiting for caches to sync for crd-autoregister
E1017 10:50:08.427904  116821 controller.go:152] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /cb28e167-3fd3-4162-b5e2-3cafe05c603a/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
W1017 10:50:08.430254  116821 warnings.go:70] node.k8s.io/v1beta1 RuntimeClass is deprecated in v1.22+, unavailable in v1.25+
W1017 10:50:08.450592  116821 warnings.go:70] node.k8s.io/v1beta1 RuntimeClass is deprecated in v1.22+, unavailable in v1.25+
I1017 10:50:08.476663  116821 controller.go:606] quota admission added evaluator for: namespaces
I1017 10:50:08.495821  116821 shared_informer.go:247] Caches are synced for cluster_authentication_trust_controller 
I1017 10:50:08.496727  116821 cache.go:39] Caches are synced for AvailableConditionController controller
I1017 10:50:08.496846  116821 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I1017 10:50:08.496762  116821 cache.go:39] Caches are synced for autoregister controller
I1017 10:50:08.504158  116821 shared_informer.go:247] Caches are synced for crd-autoregister 
I1017 10:50:09.392455  116821 controller.go:132] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I1017 10:50:09.392492  116821 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I1017 10:50:09.403998  116821 storage_scheduling.go:134] created PriorityClass system-node-critical with value 2000001000
I1017 10:50:09.410178  116821 storage_scheduling.go:134] created PriorityClass system-cluster-critical with value 2000000000
I1017 10:50:09.410210  116821 storage_scheduling.go:143] all system priority classes are created successfully or already exist.
W1017 10:50:09.503770  116821 lease.go:233] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E1017 10:50:09.505951  116821 controller.go:223] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W1017 10:50:09.581128  116821 warnings.go:70] apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
I1017 10:50:10.124938  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:10.124983  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:50:10.226090  116821 watcher.go:220] watch chan error: etcdserver: mvcc: required revision has been compacted
W1017 10:50:11.227266  116821 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
    apply_crd_test.go:525: expected DELETE, but got watch.Event{Type:"", Object:runtime.Object(nil)}
W1017 10:50:11.227927  116821 cacher.go:148] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W1017 10:50:11.228438  116821 cacher.go:148] Terminating all watchers from cacher *core.LimitRange
W1017 10:50:11.228657  116821 cacher.go:148] Terminating all watchers from cacher *core.ResourceQuota
W1017 10:50:11.228854  116821 cacher.go:148] Terminating all watchers from cacher *core.Secret
W1017 10:50:11.229367  116821 cacher.go:148] Terminating all watchers from cacher *core.ConfigMap
W1017 10:50:11.229596  116821 cacher.go:148] Terminating all watchers from cacher *core.Namespace
W1017 10:50:11.229788  116821 cacher.go:148] Terminating all watchers from cacher *core.Endpoints
W1017 10:50:11.230178  116821 cacher.go:148] Terminating all watchers from cacher *core.Pod
W1017 10:50:11.230388  116821 cacher.go:148] Terminating all watchers from cacher *core.ServiceAccount
W1017 10:50:11.230854  116821 cacher.go:148] Terminating all watchers from cacher *core.Service
W1017 10:50:11.232516  116821 cacher.go:148] Terminating all watchers from cacher *networking.IngressClass
W1017 10:50:11.232989  116821 cacher.go:148] Terminating all watchers from cacher *node.RuntimeClass
W1017 10:50:11.236922  116821 cacher.go:148] Terminating all watchers from cacher *scheduling.PriorityClass
W1017 10:50:11.237636  116821 cacher.go:148] Terminating all watchers from cacher *storage.StorageClass
W1017 10:50:11.238976  116821 cacher.go:148] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W1017 10:50:11.239086  116821 cacher.go:148] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W1017 10:50:11.239726  116821 cacher.go:148] Terminating all watchers from cacher *apiregistration.APIService
--- FAIL: TestApplyCRDNonStructuralSchema (11.42s)

				from junit_20201017-104718.xml
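
The assertion above (apply_crd_test.go:525) is a watch-event check: watch.Event{Type:"", Object:runtime.Object(nil)} is the zero value produced by receiving from a closed watch channel, which lines up with the "Terminating all watchers from cacher" messages surrounding the failure. Below is a minimal, hypothetical sketch of that pattern in Go (illustrative names, not the actual test code):

package main

import (
	"fmt"

	"k8s.io/apimachinery/pkg/watch"
)

// expectDelete waits for the next event on w and fails unless it is a DELETE.
func expectDelete(w watch.Interface) error {
	ev, ok := <-w.ResultChan()
	if !ok {
		// A receive on a closed channel yields the zero value
		// watch.Event{Type:"", Object:nil}, which formats exactly like the
		// message in the failure output above.
		return fmt.Errorf("expected DELETE, but got %#v", ev)
	}
	if ev.Type != watch.Deleted {
		return fmt.Errorf("expected DELETE, but got %#v", ev)
	}
	return nil
}

func main() {
	// Simulate the failure mode: the watcher is stopped (its channel closed)
	// before any DELETE event is delivered.
	w := watch.NewFake()
	w.Stop()
	fmt.Println(expectDelete(w))
}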



2968 Passed Tests

25 Skipped Tests

Error lines from build-log.txt

... skipping 61 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [1017 10:36:20] Call tree:
!!! [1017 10:36:20]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [1017 10:36:20]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [1017 10:36:20]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [1017 10:36:20]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [1017 10:36:20]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [1017 10:36:20] Running kubeadm tests
+++ [1017 10:36:25] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [1017 10:37:07] Running tests without code coverage
{"Time":"2020-10-17T10:38:34.695675091Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t51.410s\n"}
✓  cmd/kubeadm/test/cmd (51.414s)
... skipping 124 lines ...
I1017 10:40:15.420823   54413 client.go:360] parsed scheme: "endpoint"
I1017 10:40:15.420861   54413 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:40:15.453420   54413 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:40:15.454600   54413 instance.go:271] Using reconciler: lease
I1017 10:40:15.454882   54413 client.go:360] parsed scheme: "endpoint"
I1017 10:40:15.454916   54413 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:40:15.457100   54413 instance.go:376] Could not construct pre-rendered responses for ServiceAccountIssuerDiscovery endpoints. Endpoints will not be enabled. Error: empty issuer URL
I1017 10:40:15.457508   54413 client.go:360] parsed scheme: "endpoint"
I1017 10:40:15.457539   54413 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:40:15.460247   54413 client.go:360] parsed scheme: "endpoint"
I1017 10:40:15.460278   54413 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:40:15.461777   54413 client.go:360] parsed scheme: "endpoint"
I1017 10:40:15.461815   54413 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
... skipping 185 lines ...
+++ [1017 10:40:20] Building kube-controller-manager
+++ [1017 10:40:24] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [1017 10:40:49] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I1017 10:40:50.520140   57975 serving.go:331] Generated self-signed cert in-memory
W1017 10:40:51.050643   57975 authentication.go:368] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W1017 10:40:51.050698   57975 authentication.go:265] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W1017 10:40:51.050706   57975 authentication.go:289] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W1017 10:40:51.050759   57975 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W1017 10:40:51.050777   57975 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I1017 10:40:51.050796   57975 controllermanager.go:175] Version: v1.20.0-alpha.2.160+1df2311f7f64a8
I1017 10:40:51.052253   57975 secure_serving.go:197] Serving securely on [::]:10257
I1017 10:40:51.052289   57975 tlsconfig.go:240] Starting DynamicServingCertificateController
I1017 10:40:51.052923   57975 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I1017 10:40:51.052978   57975 leaderelection.go:243] attempting to acquire leader lease kube-system/kube-controller-manager...
... skipping 49 lines ...
W1017 10:40:51.458006   57975 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:40:51.458018   57975 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:40:51.458028   57975 controllermanager.go:549] Started "disruption"
I1017 10:40:51.458384   57975 controllermanager.go:549] Started "replicaset"
W1017 10:40:51.458671   57975 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:40:51.458739   57975 controllermanager.go:549] Started "statefulset"
E1017 10:40:51.459037   57975 core.go:91] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W1017 10:40:51.459049   57975 controllermanager.go:541] Skipping "service"
I1017 10:40:51.459579   57975 controllermanager.go:549] Started "persistentvolume-binder"
W1017 10:40:51.459876   57975 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:40:51.459904   57975 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:40:51.459914   57975 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W1017 10:40:51.460018   57975 probe.go:268] Flexvolume plugin directory at /usr/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
... skipping 82 lines ...
I1017 10:40:51.818688   57975 controllermanager.go:549] Started "nodelifecycle"
I1017 10:40:51.818708   57975 core.go:241] Will not configure cloud provider routes for allocate-node-cidrs: false, configure-cloud-routes: true.
W1017 10:40:51.818716   57975 controllermanager.go:541] Skipping "route"
I1017 10:40:51.818861   57975 node_lifecycle_controller.go:542] Starting node controller
I1017 10:40:51.818885   57975 shared_informer.go:240] Waiting for caches to sync for taint
I1017 10:40:51.819008   57975 node_lifecycle_controller.go:77] Sending events to api server
E1017 10:40:51.819045   57975 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W1017 10:40:51.819055   57975 controllermanager.go:541] Skipping "cloud-node-lifecycle"
I1017 10:40:51.819452   57975 controllermanager.go:549] Started "pvc-protection"
I1017 10:40:51.819492   57975 pvc_protection_controller.go:110] Starting PVC protection controller
I1017 10:40:51.819506   57975 shared_informer.go:240] Waiting for caches to sync for PVC protection
I1017 10:40:51.820010   57975 controllermanager.go:549] Started "endpoint"
I1017 10:40:51.820133   57975 endpoints_controller.go:184] Starting endpoint controller
... skipping 6 lines ...
I1017 10:40:51.821280   57975 controllermanager.go:549] Started "endpointslice"
I1017 10:40:51.821319   57975 endpointslice_controller.go:237] Starting endpoint slice controller
I1017 10:40:51.821479   57975 shared_informer.go:240] Waiting for caches to sync for endpoint_slice
I1017 10:40:51.821831   57975 controllermanager.go:549] Started "pv-protection"
I1017 10:40:51.821878   57975 pv_protection_controller.go:83] Starting PV protection controller
I1017 10:40:51.821892   57975 shared_informer.go:240] Waiting for caches to sync for PV protection
W1017 10:40:51.852268   57975 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I1017 10:40:51.854904   57975 shared_informer.go:247] Caches are synced for namespace 
I1017 10:40:51.855318   57975 shared_informer.go:247] Caches are synced for ClusterRoleAggregator 
I1017 10:40:51.861538   57975 shared_informer.go:247] Caches are synced for certificate-csrapproving 
I1017 10:40:51.861617   57975 shared_informer.go:247] Caches are synced for HPA 
I1017 10:40:51.861638   57975 shared_informer.go:247] Caches are synced for persistent volume 
I1017 10:40:51.861649   57975 shared_informer.go:247] Caches are synced for ReplicaSet 
I1017 10:40:51.861738   57975 shared_informer.go:247] Caches are synced for TTL 
I1017 10:40:51.861758   57975 shared_informer.go:247] Caches are synced for service account 
I1017 10:40:51.861774   57975 shared_informer.go:247] Caches are synced for job 
I1017 10:40:51.864422   54413 controller.go:606] quota admission added evaluator for: serviceaccounts
E1017 10:40:51.876328   57975 clusterroleaggregation_controller.go:181] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
I1017 10:40:51.917883   57975 shared_informer.go:247] Caches are synced for daemon sets 
I1017 10:40:51.918185   57975 shared_informer.go:247] Caches are synced for deployment 
I1017 10:40:51.919650   57975 shared_informer.go:247] Caches are synced for PVC protection 
I1017 10:40:51.920317   57975 shared_informer.go:247] Caches are synced for endpoint 
I1017 10:40:51.920712   57975 shared_informer.go:247] Caches are synced for GC 
I1017 10:40:51.921625   57975 shared_informer.go:247] Caches are synced for endpoint_slice 
... skipping 134 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [1017 10:40:57] Creating namespace namespace-1602931257-11038
namespace/namespace-1602931257-11038 created
Context "test" modified.
+++ [1017 10:40:57] Testing RESTMapper
+++ [1017 10:40:57] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIVERSION                        NAMESPACED   KIND
bindings                                       v1                                true         Binding
componentstatuses                 cs           v1                                false        ComponentStatus
configmaps                        cm           v1                                true         ConfigMap
endpoints                         ep           v1                                true         Endpoints
... skipping 59 lines ...
namespace/namespace-1602931262-13295 created
Context "test" modified.
+++ [1017 10:41:02] Testing clusterroles
rbac.sh:29: Successful get clusterroles/cluster-admin {{.metadata.name}}: cluster-admin
rbac.sh:30: Successful get clusterrolebindings/cluster-admin {{.metadata.name}}: cluster-admin
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created (dry run)
clusterrole.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created
rbac.sh:42: Successful get clusterrole/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "pod-admin" deleted
... skipping 18 lines ...
clusterrole.rbac.authorization.k8s.io/url-reader created
rbac.sh:61: Successful get clusterrole/url-reader {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: get:
rbac.sh:62: Successful get clusterrole/url-reader {{range.rules}}{{range.nonResourceURLs}}{{.}}:{{end}}{{end}}: /logs/*:/healthz/*:
clusterrole.rbac.authorization.k8s.io/aggregation-reader created
rbac.sh:64: Successful get clusterrole/aggregation-reader {{.metadata.name}}: aggregation-reader
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created
rbac.sh:77: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (server dry run)
rbac.sh:80: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
... skipping 61 lines ...
rbac.sh:102: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:foo:test-all-user:
rbac.sh:103: Successful get clusterrolebinding/super-group {{range.subjects}}{{.name}}:{{end}}: the-group:foo:test-all-user:
rbac.sh:104: Successful get clusterrolebinding/super-sa {{range.subjects}}{{.name}}:{{end}}: sa-name:foo:test-all-user:
rolebinding.rbac.authorization.k8s.io/admin created (dry run)
rolebinding.rbac.authorization.k8s.io/admin created (server dry run)
Successful
message:Error from server (NotFound): rolebindings.rbac.authorization.k8s.io "admin" not found
has: not found
rolebinding.rbac.authorization.k8s.io/admin created
rbac.sh:113: Successful get rolebinding/admin {{.roleRef.kind}}: ClusterRole
rbac.sh:114: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:
rolebinding.rbac.authorization.k8s.io/admin subjects updated
rbac.sh:116: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:foo:
... skipping 29 lines ...
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1602931270-19017 namespace.
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1602931270-19017 namespace.
Error: 1 warning received
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1602931270-19017 namespace.
Error: 1 warning received
has:Error: 1 warning received
role.rbac.authorization.k8s.io/pod-admin created (dry run)
role.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): roles.rbac.authorization.k8s.io "pod-admin" not found
has: not found
role.rbac.authorization.k8s.io/pod-admin created
rbac.sh:163: Successful get role/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
rbac.sh:164: Successful get role/pod-admin {{range.rules}}{{range.resources}}{{.}}:{{end}}{{end}}: pods:
rbac.sh:165: Successful get role/pod-admin {{range.rules}}{{range.apiGroups}}{{.}}:{{end}}{{end}}: :
Successful
... skipping 460 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector.
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:210: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:215: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 19 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:259: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:265: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:269: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:275: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 224 lines ...
core.sh:534: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.2:
Successful
message:kubectl-create kubectl-patch
has:kubectl-patch
pod/valid-pod patched
core.sh:554: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [1017 10:41:42] "kubectl patch with resourceVersion 537" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:578: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:kubectl-create kubectl-patch kubectl-replace
has:kubectl-replace
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W1017 10:41:43.484481   57975 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
core.sh:606: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced (server dry run)
node/node-v1-test replaced (dry run)
core.sh:631: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced
core.sh:647: Successful get node node-v1-test {{.metadata.annotations.a}}: b
... skipping 29 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:683: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:687: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:691: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:695: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:699: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 83 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [1017 10:41:54] Creating namespace namespace-1602931314-31149
namespace/namespace-1602931314-31149 created
Context "test" modified.
+++ [1017 10:41:54] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 43 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [1017 10:41:54] "kubectl create with empty string list" returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 31 lines ...
I1017 10:41:58.057233   57975 event.go:291] "Event occurred" object="namespace-1602931315-2353/test-deployment-retainkeys-8695b756f8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-deployment-retainkeys-8695b756f8-tbmgh"
deployment.apps "test-deployment-retainkeys" deleted
apply.sh:88: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
apply.sh:92: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
apply.sh:101: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W1017 10:41:59.262073   66438 helpers.go:567] --dry-run=true is deprecated (boolean value) and can be replaced with --dry-run=client.
pod/test-pod created (dry run)
pod/test-pod created (dry run)
... skipping 34 lines ...
pod/b created
apply.sh:196: Successful get pods a {{.metadata.name}}: a
apply.sh:197: Successful get pods b -n nsb {{.metadata.name}}: b
pod "a" deleted
pod "b" deleted
Successful
message:error: all resources selected for prune without explicitly passing --all. To prune all resources, pass the --all flag. If you did not mean to prune all resources, specify a label selector
has:all resources selected for prune without explicitly passing --all
I1017 10:42:06.578069   54413 client.go:360] parsed scheme: "passthrough"
I1017 10:42:06.578131   54413 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I1017 10:42:06.578142   54413 clientconn.go:948] ClientConn switching balancer to "pick_first"
pod/a created
pod/b created
... skipping 44 lines ...
pod/b unchanged
pod/a pruned
Warning: extensions/v1beta1 Ingress is deprecated in v1.14+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
apply.sh:254: Successful get pods -n nsb {{range.items}}{{.metadata.name}}:{{end}}: b:
namespace "nsb" deleted
Successful
message:error: the namespace from the provided object "nsb" does not match the namespace "foo". You must pass '--namespace=nsb' to perform this operation.
has:the namespace from the provided object "nsb" does not match the namespace "foo".
apply.sh:265: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
service/a created
apply.sh:269: Successful get services a {{.metadata.name}}: a
Successful
message:The Service "a" is invalid: spec.clusterIP: Invalid value: "10.0.0.12": field is immutable
... skipping 25 lines ...
apply.sh:291: Successful get deployment test-the-deployment {{.metadata.name}}: test-the-deployment
apply.sh:292: Successful get service test-the-service {{.metadata.name}}: test-the-service
configmap "test-the-map" deleted
service "test-the-service" deleted
deployment.apps "test-the-deployment" deleted
Successful
message:Error from server (NotFound): namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
apply.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:namespace/multi-resource-ns created
Error from server (NotFound): error when creating "hack/testdata/multi-resource-1.yaml": namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
Successful
message:Error from server (NotFound): pods "test-pod" not found
has:pods "test-pod" not found
pod/test-pod created
namespace/multi-resource-ns unchanged
apply.sh:308: Successful get pods test-pod -n multi-resource-ns {{.metadata.name}}: test-pod
(Bpod "test-pod" deleted
namespace "multi-resource-ns" deleted
I1017 10:42:36.183385   57975 namespace_controller.go:185] Namespace has been deleted nsb
I1017 10:42:39.004883   54413 client.go:360] parsed scheme: "passthrough"
I1017 10:42:39.004950   54413 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I1017 10:42:39.005006   54413 clientconn.go:948] ClientConn switching balancer to "pick_first"
apply.sh:314: Successful get configmaps {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:configmap/foo created
error: unable to recognize "hack/testdata/multi-resource-2.yaml": no matches for kind "Bogus" in version "example.com/v1"
has:no matches for kind "Bogus" in version "example.com/v1"
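As with the namespace case above, kubectl keeps going through a multi-document file and reports per-object failures: the ConfigMap is created while the unknown Bogus kind fails REST mapping, since no CRD serves example.com/v1:
  kubectl apply -f hack/testdata/multi-resource-2.yaml   # configmap/foo created; Bogus fails with "no matches for kind"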
apply.sh:320: Successful get configmaps foo {{.metadata.name}}: foo
configmap "foo" deleted
apply.sh:326: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:pod/pod-a created
... skipping 6 lines ...
pod "pod-c" deleted
apply.sh:334: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apply.sh:338: Successful get crds {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/widgets.example.com created
error: unable to recognize "hack/testdata/multi-resource-4.yaml": no matches for kind "Widget" in version "example.com/v1"
has:no matches for kind "Widget" in version "example.com/v1"
I1017 10:42:41.721468   54413 client.go:360] parsed scheme: "endpoint"
I1017 10:42:41.721521   54413 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
Successful
message:Error from server (NotFound): widgets.example.com "foo" not found
has:widgets.example.com "foo" not found
apply.sh:344: Successful get crds widgets.example.com {{.metadata.name}}: widgets.example.com
I1017 10:42:42.024242   54413 controller.go:606] quota admission added evaluator for: widgets.example.com
widget.example.com/foo created
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/widgets.example.com unchanged
... skipping 34 lines ...
message:801
has:801
pod "test-pod" deleted
apply.sh:403: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ [1017 10:42:44] Testing upgrade kubectl client-side apply to server-side apply
pod/test-pod created
error: Apply failed with 1 conflict: conflict with "kubectl-client-side-apply" using v1: .metadata.labels.name
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the `--force-conflicts` flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
... skipping 79 lines ...
(Bpod "nginx-extensions" deleted
Successful
message:pod/test1 created
has:pod/test1 created
pod "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
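The conflict block earlier in this run is server-side apply's field-ownership check: .metadata.labels.name was last set by the kubectl-client-side-apply field manager, so a server-side apply from another manager must either drop the field from its manifest or take ownership explicitly. A sketch of the forcing path (manifest name hypothetical):
  kubectl apply --server-side -f pod.yaml                     # fails with the conflict shown above
  kubectl apply --server-side --force-conflicts -f pod.yaml   # takes ownership of the conflicting fields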
Recording: run_kubectl_create_filter_tests
Running command: run_kubectl_create_filter_tests

+++ Running case: test-cmd.run_kubectl_create_filter_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 3 lines ...
Context "test" modified.
+++ [1017 10:42:48] Testing kubectl create filter
create.sh:50: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:54: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 29 lines ...
I1017 10:42:51.573029   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-9bb9c4878 to 3"
I1017 10:42:51.577387   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-rr7pf"
I1017 10:42:51.585543   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-rzsx7"
I1017 10:42:51.585586   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-89n54"
apps.sh:152: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1602931369-16454\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1602931369-16454"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
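The Conflict above is ordinary optimistic concurrency: the patch pins resourceVersion "99", the Deployment has since moved on, and the server asks the client to re-read and retry against the latest version. A sketch of the manual recovery:
  kubectl get deployment nginx -o yaml   # fetch the current object, with its current resourceVersion
  # update the manifest from the fresh copy, then re-apply it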
deployment.apps/nginx configured
I1017 10:43:00.222683   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-6dd6cfdb57 to 3"
I1017 10:43:00.228590   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-h4v2l"
I1017 10:43:00.233586   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-jsvbl"
I1017 10:43:00.233863   57975 event.go:291] "Event occurred" object="namespace-1602931369-16454/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-dln6d"
Successful
... skipping 308 lines ...
+++ [1017 10:43:09] Creating namespace namespace-1602931389-26004
namespace/namespace-1602931389-26004 created
Context "test" modified.
+++ [1017 10:43:09] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1602931389-26004 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1602931389-26004 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I1017 10:43:11.109138   69940 loader.go:375] Config loaded from file:  /tmp/tmp.ujNjJcNcKe/.kube/config
I1017 10:43:11.110578   69940 round_trippers.go:444] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 0 milliseconds
I1017 10:43:11.134162   69940 round_trippers.go:444] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
I1017 10:43:11.136057   69940 round_trippers.go:444] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 635 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
(B<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-10-17T10:43:18Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl-create", "operation":"Update", "time":"2020-10-17T10:43:18Z"}}, "name":"valid-pod", "namespace":"namespace-1602931398-5673", "resourceVersion":"961", "uid":"feb86f3b-aee4-4af3-a377-8f071a9429b3"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "preemptionPolicy":"PreemptLowerPriority", "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
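Both output engines fail loudly on absent keys, as the jsonpath error above and the go-template errors below show; the object dump is printed specifically to help debug the expression. For contrast, working and failing probes against the same pod:
  kubectl get pod valid-pod -o jsonpath='{.metadata.name}'        # prints: valid-pod
  kubectl get pod valid-pod -o jsonpath='{.missing}'              # error: missing is not found
  kubectl get pod valid-pod -o go-template='{{.metadata.name}}'   # prints: valid-pod
  kubectl get pod valid-pod -o go-template='{{.missing}}'         # error: map has no entry for key "missing"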
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-10-17T10:43:18Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl-create","operation":"Update","time":"2020-10-17T10:43:18Z"}],"name":"valid-pod","namespace":"namespace-1602931398-5673","resourceVersion":"961","uid":"feb86f3b-aee4-4af3-a377-8f071a9429b3"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"preemptionPolicy":"PreemptLowerPriority","priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-10-17T10:43:18Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl-create operation:Update time:2020-10-17T10:43:18Z]] name:valid-pod namespace:namespace-1602931398-5673 resourceVersion:961 uid:feb86f3b-aee4-4af3-a377-8f071a9429b3] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true preemptionPolicy:PreemptLowerPriority priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
... skipping 156 lines ...
  terminationGracePeriodSeconds: 30
status:
  phase: Pending
  qosClass: Guaranteed
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 36 lines ...
+++ [1017 10:43:24] Creating namespace namespace-1602931404-9656
namespace/namespace-1602931404-9656 created
Context "test" modified.
+++ [1017 10:43:24] Testing kubectl exec POD COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
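Every exec call above carries the deprecation warning: without the -- separator kubectl has to guess where its own flags end and the remote command begins, so the separated form is now the documented one:
  kubectl exec test-pod date      # deprecated: ambiguous parsing, prints the warning above
  kubectl exec test-pod -- date   # preferred: everything after -- is the command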
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 3 lines ...
+++ [1017 10:43:25] Creating namespace namespace-1602931405-6265
namespace/namespace-1602931405-6265 created
Context "test" modified.
+++ [1017 10:43:25] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: the server doesn't have a resource type "foo"
has:error:
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I1017 10:43:26.226633   57975 event.go:291] "Event occurred" object="namespace-1602931405-6265/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-75s8x"
I1017 10:43:26.230651   57975 event.go:291] "Event occurred" object="namespace-1602931405-6265/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-4klr7"
I1017 10:43:26.233187   57975 event.go:291] "Event occurred" object="namespace-1602931405-6265/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-k4vpt"
configmap/test-set-env-config created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod, type/name or --filename must be specified
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-4klr7 does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-4klr7 does not have a host assigned
has not:pod, type/name or --filename must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"639c9f6d-19c6-408e-a618-29cd6a481b97","resourceVersion":"1038","creationTimestamp":"2020-10-17T10:43:27Z"}}
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"639c9f6d-19c6-408e-a618-29cd6a481b97","resourceVersion":"1040","creationTimestamp":"2020-10-17T10:43:27Z"},"data":{"key1":"config1"}}
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"639c9f6d-19c6-408e-a618-29cd6a481b97","resourceVersion":"1040","creationTimestamp":"2020-10-17T10:43:27Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"639c9f6d-19c6-408e-a618-29cd6a481b97"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 172 lines ...
has:Timeout
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
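The "Invalid timeout value" message comes from kubectl's duration parsing (here most likely the --request-timeout flag these get tests exercise): the value must be a bare integer in seconds or an integer with a time unit:
  kubectl get pods --request-timeout=30    # valid: bare seconds
  kubectl get pods --request-timeout=1m    # valid: integer plus unit
  kubectl get pods --request-timeout=abc   # invalid (hypothetical bad value): rejected as above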
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 240 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [1017 10:43:38] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 370 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
+++ [1017 10:44:07] Testing recursive resources
... skipping 2 lines ...
Context "test" modified.
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
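The recursive tests feed in deliberately broken testdata: busybox-broken.yaml spells the kind field as "ind", so client-side schema validation rejects it while the valid siblings are created. The message names the escape hatch, sketched here (the exact flags in the test script may differ):
  kubectl create -f hack/testdata/recursive/pod --recursive                    # fails validation on the broken file
  kubectl create -f hack/testdata/recursive/pod --recursive --validate=false   # skips validation; the broken file then fails decoding instead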
W1017 10:44:08.509652   54413 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E1017 10:44:08.510840   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
W1017 10:44:08.615296   54413 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E1017 10:44:08.616413   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W1017 10:44:08.725954   54413 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E1017 10:44:08.727080   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
W1017 10:44:08.830593   54413 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E1017 10:44:08.831832   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:Name:         busybox0
Namespace:    namespace-1602931447-10938
Priority:     0
Node:         <none>
... skipping 155 lines ...
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1017 10:44:09.512161   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1017 10:44:09.913335   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:10.009744   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:10.108651   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:Warning: resource pods/busybox0 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox0 configured
Warning: resource pods/busybox1 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox1 configured
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I1017 10:44:10.455581   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-54785cbcb8 to 3"
I1017 10:44:10.460756   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-b2tps"
I1017 10:44:10.463970   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-sxsdz"
I1017 10:44:10.467701   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-glnjp"
... skipping 48 lines ...
deployment.apps "nginx" deleted
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1017 10:44:11.815959   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I1017 10:44:12.233876   57975 namespace_controller.go:185] Namespace has been deleted non-native-resources
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E1017 10:44:12.343484   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/busybox0 created
I1017 10:44:12.538216   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-t2tbt"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1017 10:44:12.545362   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-wzs7r"
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
E1017 10:44:13.153252   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:13.181283   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I1017 10:44:14.294370   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-rnkbm"
I1017 10:44:14.306044   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-rv2br"
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx1-deployment created
I1017 10:44:15.081046   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx1-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx1-deployment-758b5949b6 to 2"
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1017 10:44:15.087423   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-q46hv"
I1017 10:44:15.089992   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx0-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx0-deployment-75db9cdfd9 to 2"
I1017 10:44:15.093118   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-49hgz"
I1017 10:44:15.104995   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-n28kd"
I1017 10:44:15.108197   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-vg8bq"
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
(BSuccessful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I1017 10:44:17.422400   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-n6tpp"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1017 10:44:17.429171   57975 event.go:291] "Event occurred" object="namespace-1602931447-10938/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-vmnj7"
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
... skipping 2 lines ...
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
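The pause/resume failures above reflect that the kubectl rollout verbs are implemented per kind: Deployments support them (both nginx deployments were paused earlier in this run), while ReplicationControllers have no rollout machinery at all, hence "pausing is not supported":
  kubectl rollout pause deployment/nginx1-deployment   # supported
  kubectl rollout pause rc/busybox0                    # error: pausing is not supported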
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E1017 10:44:17.936845   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:18.010640   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:18.155287   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:18.294323   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [1017 10:44:18] Testing kubectl(v1:namespaces)
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created (dry run)
namespace/my-namespace created (server dry run)
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1459: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
(Bnamespace "my-namespace" deleted
namespace/my-namespace condition met
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1468: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
... skipping 31 lines ...
namespace "namespace-1602931408-13143" deleted
namespace "namespace-1602931408-16801" deleted
namespace "namespace-1602931410-5470" deleted
namespace "namespace-1602931412-6711" deleted
namespace "namespace-1602931414-9771" deleted
namespace "namespace-1602931447-10938" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1602931253-12075" deleted
... skipping 29 lines ...
namespace "namespace-1602931408-13143" deleted
namespace "namespace-1602931408-16801" deleted
namespace "namespace-1602931410-5470" deleted
namespace "namespace-1602931412-6711" deleted
namespace "namespace-1602931414-9771" deleted
namespace "namespace-1602931447-10938" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
namespace/quotas created
core.sh:1475: Successful get namespaces/quotas {{.metadata.name}}: quotas
core.sh:1476: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
v1
resourcequota/test-quota created (dry run)
... skipping 4 lines ...
core.sh:1480: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
v1
resourcequota/test-quota created
core.sh:1483: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: found:
resourcequota "test-quota" deleted
namespace "quotas" deleted
E1017 10:44:26.273855   57975 resource_quota_controller.go:252] Operation cannot be fulfilled on resourcequotas "test-quota": StorageError: invalid object, Code: 4, Key: /registry/resourcequotas/quotas/test-quota, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 177547d4-19dd-461f-8f92-36d73abd7d0a, UID in object meta: 
I1017 10:44:26.274020   57975 shared_informer.go:240] Waiting for caches to sync for resource quota
I1017 10:44:26.274076   57975 shared_informer.go:247] Caches are synced for resource quota 
I1017 10:44:26.274112   57975 resource_quota_controller.go:306] Resource quota has been deleted quotas/test-quota
I1017 10:44:26.279119   57975 resource_quota_controller.go:306] Resource quota has been deleted quotas/test-quota
E1017 10:44:27.834358   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1017 10:44:28.004016   57975 horizontal.go:354] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1602931447-10938
I1017 10:44:28.007925   57975 horizontal.go:354] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1602931447-10938
E1017 10:44:28.783268   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:28.824627   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1017 10:44:30.201868   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1495: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1499: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1503: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
core.sh:1507: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1509: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1516: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1520: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
(Bnamespace "other" deleted
... skipping 146 lines ...
configmap/test-configmap created (server dry run)
core.sh:46: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
configmap/test-binary-configmap created
core.sh:51: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:52: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E1017 10:44:48.129074   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E1017 10:44:50.438503   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1017 10:44:51.365198   57975 namespace_controller.go:185] Namespace has been deleted test-secrets
E1017 10:44:53.061682   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [1017 10:44:53] Creating namespace namespace-1602931493-18288
namespace/namespace-1602931493-18288 created
Context "test" modified.
+++ [1017 10:44:53] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
E1017 10:44:54.194221   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
... skipping 43 lines ...
Labels:                        <none>
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  <none>
... skipping 37 lines ...
Labels:         controller-uid=bda06694-23c0-44f6-a809-a43f341b5a61
                job-name=test-job
Annotations:    cronjob.kubernetes.io/instantiate: manual
Parallelism:    1
Completions:    1
Start Time:     Sat, 17 Oct 2020 10:45:02 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=bda06694-23c0-44f6-a809-a43f341b5a61
           job-name=test-job
  Containers:
   pi:
    Image:      k8s.gcr.io/perl
... skipping 460 lines ...
  type: ClusterIP
status:
  loadBalancer: {}
Successful
message:kubectl-create kubectl-set
has:kubectl-set
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1020: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
I1017 10:45:13.549000   57975 namespace_controller.go:185] Namespace has been deleted test-jobs
service/redis-master selector updated
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
core.sh:1033: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
(Bservice "redis-master" deleted
core.sh:1040: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1044: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
... skipping 41 lines ...
service/testmetadata created (server dry run)
pod/testmetadata created (server dry run)
core.sh:1148: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
Flag --service-overrides has been deprecated, and will be removed in the future.
service/testmetadata created
pod/testmetadata created
E1017 10:45:18.011501   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1152: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
core.sh:1153: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
Successful
message:kubectl-run
has:kubectl-run
service/exposemetadata exposed
... skipping 75 lines ...
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
apps.sh:92: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:93: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:98: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind rolled back
E1017 10:45:23.382080   57975 daemon_controller.go:320] namespace-1602931520-29090/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1602931520-29090", SelfLink:"", UID:"571483bd-3dc2-4c6c-9050-22b32d7a4b80", ResourceVersion:"1849", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63738528321, loc:(*time.Location)(0x6b39ea0)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1602931520-29090\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0005d0700), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0005d07c0)}, v1.ManagedFieldsEntry{Manager:"kubectl-client-side-apply", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0005d0820), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0005d08e0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0005d09e0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0005d0a00)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc0005d0a40), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002855308), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0027aa3f0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc0005d0be0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil), SetHostnameAsFQDN:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc000388c58)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc0028553cc)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:101: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:102: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:103: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
Running command: run_rc_tests

+++ Running case: test-cmd.run_rc_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rc_tests
+++ [1017 10:45:23] Creating namespace namespace-1602931523-30657
E1017 10:45:23.882802   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1602931523-30657 created
Context "test" modified.
+++ [1017 10:45:23] Testing kubectl(v1:replicationcontrollers)
core.sh:1191: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I1017 10:45:24.257487   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-vph9t"
... skipping 20 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-9vpm5
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-7kkb4
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-5zkwn
E1017 10:45:25.362821   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
... skipping 5 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1602931523-30657
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1224: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E1017 10:45:25.928298   57975 replica_set.go:201] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1602931523-30657  a31ce9c3-b2a3-4fa0-aa7f-2b212a74cbab 1883 2 2020-10-17 10:45:24 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kube-controller-manager Update v1 2020-10-17 10:45:24 +0000 UTC FieldsV1 {"f:status":{"f:fullyLabeledReplicas":{},"f:observedGeneration":{},"f:replicas":{}}}} {kubectl-create Update v1 2020-10-17 10:45:24 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{"f:replicas":{},"f:selector":{".":{},"f:app":{},"f:tier":{}},"f:template":{".":{},"f:metadata":{".":{},"f:creationTimestamp":{},"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{".":{},"f:containers":{".":{},"k:{\"name\":\"php-redis\"}":{".":{},"f:env":{".":{},"k:{\"name\":\"GET_HOSTS_FROM\"}":{".":{},"f:name":{},"f:value":{}}},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:ports":{".":{},"k:{\"containerPort\":80,\"protocol\":\"TCP\"}":{".":{},"f:containerPort":{},"f:protocol":{}}},"f:resources":{".":{},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc00268eb68 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] [] <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I1017 10:45:25.936288   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: frontend-7kkb4"
core.sh:1228: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1232: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
core.sh:1236: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1240: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller/frontend scaled
I1017 10:45:26.481058   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-6k8tr"
core.sh:1244: Successful get rc frontend {{.spec.replicas}}: 3
E1017 10:45:26.640350   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1248: Successful get rc frontend {{.spec.replicas}}: 3
E1017 10:45:26.764391   57975 replica_set.go:201] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1602931523-30657  a31ce9c3-b2a3-4fa0-aa7f-2b212a74cbab 1895 4 2020-10-17 10:45:24 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kube-controller-manager Update v1 2020-10-17 10:45:24 +0000 UTC FieldsV1 {"f:status":{"f:fullyLabeledReplicas":{},"f:observedGeneration":{},"f:replicas":{}}}} {kubectl-create Update v1 2020-10-17 10:45:24 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{"f:replicas":{},"f:selector":{".":{},"f:app":{},"f:tier":{}},"f:template":{".":{},"f:metadata":{".":{},"f:creationTimestamp":{},"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{".":{},"f:containers":{".":{},"k:{\"name\":\"php-redis\"}":{".":{},"f:env":{".":{},"k:{\"name\":\"GET_HOSTS_FROM\"}":{".":{},"f:name":{},"f:value":{}}},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:ports":{".":{},"k:{\"containerPort\":80,\"protocol\":\"TCP\"}":{".":{},"f:containerPort":{},"f:protocol":{}}},"f:resources":{".":{},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002a8cf48 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] [] <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
replicationcontroller/frontend scaled
I1017 10:45:26.772570   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: frontend-6k8tr"
core.sh:1252: Successful get rc frontend {{.spec.replicas}}: 2
(Breplicationcontroller "frontend" deleted
... skipping 26 lines ...
(Bdeployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
deployment.apps/nginx-deployment created
I1017 10:45:28.652713   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-76b5cd66f5 to 3"
I1017 10:45:28.656457   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-qnbs8"
I1017 10:45:28.659921   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-dblmk"
... skipping 23 lines ...
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 30 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1391: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1395: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
core.sh:1404: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 24 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I1017 10:45:34.125934   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-748ddcb48b to 3"
I1017 10:45:34.131991   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-dpx4w"
I1017 10:45:34.139710   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-q6b5q"
I1017 10:45:34.139758   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-59jq4"
core.sh:1410: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1411: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1412: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I1017 10:45:34.511278   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-7bfb7d56b6 to 1"
I1017 10:45:34.514338   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources-7bfb7d56b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-7bfb7d56b6-qqstl"
core.sh:1415: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
core.sh:1416: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I1017 10:45:34.886785   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-resources-748ddcb48b to 2"
I1017 10:45:34.893683   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-resources-748ddcb48b-dpx4w"
I1017 10:45:34.898856   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-75dbcccf44 to 1"
I1017 10:45:34.906315   57975 event.go:291] "Event occurred" object="namespace-1602931523-30657/nginx-deployment-resources-75dbcccf44" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-75dbcccf44-k5vlp"
core.sh:1421: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
... skipping 390 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1432: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1433: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1434: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 46 lines ...
                pod-template-hash=69dd6dcd84
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=69dd6dcd84
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 102 lines ...
apps.sh:305: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
deployment.apps/nginx rolled back (server dry run)
apps.sh:309: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
apps.sh:313: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
apps.sh:316: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
apps.sh:320: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I1017 10:45:46.244478   57975 event.go:291] "Event occurred" object="namespace-1602931536-22821/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-54785cbcb8 to 2"
I1017 10:45:46.255557   57975 event.go:291] "Event occurred" object="namespace-1602931536-22821/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-54785cbcb8-t8488"
I1017 10:45:46.257544   57975 event.go:291] "Event occurred" object="namespace-1602931536-22821/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-fb66667c8 to 1"
I1017 10:45:46.261479   57975 event.go:291] "Event occurred" object="namespace-1602931536-22821/nginx-fb66667c8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-fb66667c8-xjqtm"
Successful
... skipping 148 lines ...
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I1017 10:45:49.060924   57975 event.go:291] "Event occurred" object="namespace-1602931536-22821/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-6dd48b9849 to 1"
I1017 10:45:49.065525   57975 event.go:291] "Event occurred" object="namespace-1602931536-22821/nginx-deployment-6dd48b9849" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-6dd48b9849-bdzh2"
apps.sh:367: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:368: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
apps.sh:373: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:374: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
apps.sh:377: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:378: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
... skipping 60 lines ...

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [1017 10:45:53] Creating namespace namespace-1602931553-28459
namespace/namespace-1602931553-28459 created
E1017 10:45:53.879948   57975 replica_set.go:532] sync "namespace-1602931536-22821/nginx-deployment-68d657fb6" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-68d657fb6": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1602931536-22821/nginx-deployment-68d657fb6, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8412e0fd-9dca-4da0-99e3-ce7eec7cbc9c, UID in object meta: 
Context "test" modified.
+++ [1017 10:45:53] Testing kubectl(v1:replicasets)
E1017 10:45:53.979062   57975 replica_set.go:532] sync "namespace-1602931536-22821/nginx-deployment-59b7fccd97" failed with replicasets.apps "nginx-deployment-59b7fccd97" not found
apps.sh:541: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E1017 10:45:54.029134   57975 replica_set.go:532] sync "namespace-1602931536-22821/nginx-deployment-57ddd474c4" failed with replicasets.apps "nginx-deployment-57ddd474c4" not found
replicaset.apps/frontend created
I1017 10:45:54.192561   57975 event.go:291] "Event occurred" object="namespace-1602931553-28459/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-h4rj9"
+++ [1017 10:45:54] Deleting rs
I1017 10:45:54.195895   57975 event.go:291] "Event occurred" object="namespace-1602931553-28459/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-72vg8"
I1017 10:45:54.199156   57975 event.go:291] "Event occurred" object="namespace-1602931553-28459/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-j7gb7"
replicaset.apps "frontend" deleted
apps.sh:547: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:551: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E1017 10:45:54.479259   57975 replica_set.go:532] sync "namespace-1602931553-28459/frontend" failed with replicasets.apps "frontend" not found
replicaset.apps/frontend created
I1017 10:45:54.668877   57975 event.go:291] "Event occurred" object="namespace-1602931553-28459/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-9jddr"
I1017 10:45:54.674679   57975 event.go:291] "Event occurred" object="namespace-1602931553-28459/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-9mqb5"
I1017 10:45:54.680615   57975 event.go:291] "Event occurred" object="namespace-1602931553-28459/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-47hq4"
apps.sh:555: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [1017 10:45:54] Deleting rs
E1017 10:45:54.783288   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
E1017 10:45:54.929141   57975 replica_set.go:532] sync "namespace-1602931553-28459/frontend" failed with replicasets.apps "frontend" not found
apps.sh:559: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:561: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
pod "frontend-47hq4" deleted
pod "frontend-9jddr" deleted
pod "frontend-9mqb5" deleted
apps.sh:564: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 15 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 26 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1602931553-28459
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 181 lines ...
replicaset.apps/frontend resource requirements updated
apps.sh:657: Successful get rs frontend {{.metadata.generation}}: 4
replicaset.apps/frontend serviceaccount updated (dry run)
replicaset.apps/frontend serviceaccount updated (server dry run)
apps.sh:660: Successful get rs frontend {{.metadata.generation}}: 4
replicaset.apps/frontend serviceaccount updated
E1017 10:46:01.858087   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:662: Successful get rs frontend {{.metadata.generation}}: 5
Successful
message:kube-controller-manager kubectl-create kubectl-set
has:kubectl-set
apps.sh:670: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Breplicaset.apps "frontend" deleted
... skipping 23 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:706: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
Successful
message:kubectl-autoscale
has:kubectl-autoscale
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
... skipping 61 lines ...
apps.sh:466: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:467: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
statefulset.apps/nginx rolled back
apps.sh:470: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:471: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:475: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:476: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
apps.sh:479: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:480: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
... skipping 58 lines ...
Name:         mock
Namespace:    namespace-1602931569-16260
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 15 lines ...
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(Bservice/mock labeled
replicationcontroller/mock labeled
E1017 10:46:11.719561   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
E1017 10:46:11.842564   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bservice/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
... skipping 28 lines ...
Name:         mock
Namespace:    namespace-1602931569-16260
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 56 lines ...
Name:         mock
Namespace:    namespace-1602931569-16260
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 41 lines ...
Namespace:    namespace-1602931569-16260
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1602931569-16260
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 196 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 38 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 29 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 17 Oct 2020 10:40:51 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 17 Oct 2020 10:40:51 +0000   Sat, 17 Oct 2020 10:41:52 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 138 lines ...
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
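The yes/no answers and the --subresource error above come from kubectl auth can-i; a sketch of the request shapes involved (verbs and paths are illustrative):

  kubectl auth can-i get pods --subresource=log    # resource request with a subresource
  kubectl auth can-i get /logs                     # NonResourceURL request
  # combining the two is rejected, as asserted above:
  kubectl auth can-i get /logs --subresource=log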
... skipping 59 lines ...
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:840: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:841: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:842: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:843: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
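The testing-R/RB/CR/CRB objects above are typically managed with kubectl auth reconcile, which accepts only rbac.authorization.k8s.io/v1 manifests; that is the source of the error asserted above for a v1beta1 ClusterRole. A sketch (the manifest file name is hypothetical):

  kubectl auth reconcile -f rbac-v1.yaml
  kubectl get clusterroles -l test-cmd=auth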
... skipping 18 lines ...
Context "test" modified.
+++ [1017 10:46:33] Testing resource aliasing
replicationcontroller/cassandra created
I1017 10:46:33.205508   57975 event.go:291] "Event occurred" object="namespace-1602931592-29195/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-cptwx"
I1017 10:46:33.211643   57975 event.go:291] "Event occurred" object="namespace-1602931592-29195/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-p6gxz"
service/cassandra created
E1017 10:46:33.423342   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
discovery.sh:91: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
I1017 10:46:33.651187   57975 event.go:291] "Event occurred" object="namespace-1602931592-29195/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-vrvjn"
pod "cassandra-cptwx" deleted
pod "cassandra-p6gxz" deleted
I1017 10:46:33.663269   57975 event.go:291] "Event occurred" object="namespace-1602931592-29195/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-l4vtt"
replicationcontroller "cassandra" deleted
E1017 10:46:33.670927   57975 replica_set.go:532] sync "namespace-1602931592-29195/cassandra" failed with replicationcontrollers "cassandra" not found
service "cassandra" deleted
+++ exit code: 0
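In the discovery.sh check above, "all" is a resource alias that expands to the core workload types, so a single label selector can match the RC, its pods, and the service together; a sketch (label from the log):

  kubectl get all -l app=cassandra
  kubectl delete all -l app=cassandra   # deletes everything the alias matched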
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
... skipping 119 lines ...
Running command: run_kubectl_sort_by_tests

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [1017 10:46:34] Testing kubectl --sort-by
E1017 10:46:34.833523   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
No resources found in namespace-1602931592-29195 namespace.
No resources found in namespace-1602931592-29195 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
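The --sort-by flag being tested here takes a JSONPath expression evaluated against each list item; a sketch of the form (the expressions are illustrative, not the harness's exact ones):

  kubectl get pods --sort-by='{.metadata.name}'
  kubectl get pods --sort-by='{.metadata.creationTimestamp}'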
... skipping 221 lines ...
namespace-1602931584-20817   default   0         14s
namespace-1602931592-29195   default   0         6s
some-other-random            default   0         7s
has:all-ns-test-2
namespace "all-ns-test-1" deleted
namespace "all-ns-test-2" deleted
E1017 10:46:47.912298   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1017 10:46:48.874231   57975 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
... skipping 771 lines ...
message:node/127.0.0.1 already uncordoned (server dry run)
has:already uncordoned
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
node-management.sh:150: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
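The usage errors above reflect how cordon and drain address nodes: exactly one node name, or a --selector, but never both. A sketch (node name from the log; --dry-run=server matches the "(server dry run)" line earlier):

  kubectl cordon 127.0.0.1
  kubectl drain 127.0.0.1 --ignore-daemonsets
  kubectl uncordon 127.0.0.1 --dry-run=server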
... skipping 17 lines ...
+++ [1017 10:47:03] Testing kubectl plugins
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"
error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable to read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
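kubectl discovers plugins as executables named kubectl-* on PATH, which is what the warnings above check for (a plugin overwriting a builtin, or shadowed duplicates). A minimal plugin matching the "I am plugin foo" output (the /tmp path is hypothetical):

  mkdir -p /tmp/plugins
  cat > /tmp/plugins/kubectl-foo <<'EOF'
  #!/bin/sh
  echo "I am plugin foo"
  EOF
  chmod +x /tmp/plugins/kubectl-foo
  PATH=/tmp/plugins:$PATH kubectl foo    # prints: I am plugin foo
  PATH=/tmp/plugins:$PATH kubectl plugin list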
... skipping 10 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [1017 10:47:04] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
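The CSR checks above exercise impersonation: --as sets the username recorded in .spec.username, and requesting --as-group without --as fails with the "without impersonating a user" error asserted earlier. A sketch (csr.yaml is hypothetical):

  kubectl create -f csr.yaml --as=user1 --as-group=system:authenticated
  kubectl get csr foo -o go-template='{{.spec.username}}'    # user1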
... skipping 19 lines ...
deployment.apps/test-1 created
I1017 10:47:05.859415   57975 event.go:291] "Event occurred" object="namespace-1602931625-18406/test-1-7487ff9cbb" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-1-7487ff9cbb-9sk8v"
I1017 10:47:05.942130   57975 event.go:291] "Event occurred" object="namespace-1602931625-18406/test-2" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set test-2-646997777c to 1"
deployment.apps/test-2 created
I1017 10:47:05.951037   57975 event.go:291] "Event occurred" object="namespace-1602931625-18406/test-2-646997777c" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-2-646997777c-477d7"
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E1017 10:47:07.975969   57975 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
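The "condition met" messages above come from kubectl wait, which blocks until the requested condition holds on every matched object; a sketch (deployment names from the log):

  kubectl wait deployment/test-1 deployment/test-2 --for=condition=Available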
... skipping 23 lines ...
I1017 10:47:08.387693   54413 secure_serving.go:241] Stopped listening on 127.0.0.1:6443
I1017 10:47:08.387697   54413 nonstructuralschema_controller.go:204] Shutting down NonStructuralSchemaConditionController
I1017 10:47:08.387708   54413 naming_controller.go:302] Shutting down NamingConditionController
I1017 10:47:08.387724   54413 crd_finalizer.go:278] Shutting down CRDFinalizer
I1017 10:47:08.387665   54413 apiservice_controller.go:131] Shutting down APIServiceRegistrationController
I1017 10:47:08.387690   54413 apiapproval_controller.go:198] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
W1017 10:47:08.388425   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 7 lines ...
I1017 10:47:08.389074   54413 available_controller.go:469] Shutting down AvailableConditionController
... skipping 19 lines ...
I1017 10:47:08.389659   54413 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
... skipping 45 lines ...
junit report dir: /logs/artifacts
+++ [1017 10:47:08] Clean up complete
+ make test-integration
W1017 10:47:09.389163   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389187   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389214   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389162   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389227   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389162   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389325   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389402   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389444   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389451   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389451   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389460   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389507   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389577   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389605   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389617   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389640   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389656   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389662   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389668   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389670   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389684   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389695   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389720   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389745   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389799   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389799   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389859   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389917   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.389921   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390002   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390158   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390157   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390187   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390215   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390227   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390243   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390259   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390270   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390293   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390324   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390330   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390345   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390373   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390384   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390464   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390475   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390483   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390487   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1017 10:47:09.390533   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 107 lines ...
W1017 10:47:13.278654   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
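Note: the repeated warnings above are the apiserver's etcd client stuck in gRPC's reconnect loop; nothing is listening on 127.0.0.1:2379 until the harness starts etcd below, so every transport attempt fails with connection refused and is retried. A minimal Go sketch of the same failure mode (illustrative only, using google.golang.org/grpc; not code from this job):

package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
)

func main() {
	// Dial a port with no listener. With WithBlock the dial keeps
	// retrying until the context expires; each failed attempt is what
	// emits the "createTransport ... Reconnecting" warnings above
	// (visible when gRPC logging is raised, e.g.
	// GRPC_GO_LOG_SEVERITY_LEVEL=warning).
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()
	conn, err := grpc.DialContext(ctx, "127.0.0.1:2379",
		grpc.WithInsecure(), grpc.WithBlock())
	if err != nil {
		log.Printf("dial failed as expected: %v", err) // context deadline exceeded
		return
	}
	conn.Close()
}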
+++ [1017 10:47:13] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [1017 10:47:13] Starting etcd instance
W1017 10:47:13.303502   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 3 lines ...
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.5UMiTWGyTd --listen-client-urls http://127.0.0.1:2379 --log-level=debug > "/logs/artifacts/etcd.ceb4ffc8-1063-11eb-9643-1efb4cfeb756.root.log.DEBUG.20201017-104713.93051" 2>/dev/null
Waiting for etcd to come up.
W1017 10:47:13.358176   54413 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
+++ [1017 10:47:13] On try 2, etcd: : {"health":"true"}
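The readiness poll behind the "On try" line hits etcd's /health endpoint, which serves {"health":"true"} once the member is up. A rough Go equivalent of that wait loop (an illustrative sketch, not the harness's actual shell code):

package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"time"
)

func main() {
	// Poll http://127.0.0.1:2379/health until etcd answers, mirroring
	// the "Waiting for etcd to come up." step above.
	for try := 1; try <= 30; try++ {
		resp, err := http.Get("http://127.0.0.1:2379/health")
		if err == nil {
			body, _ := ioutil.ReadAll(resp.Body)
			resp.Body.Close()
			fmt.Printf("On try %d, etcd: %s\n", try, body)
			return
		}
		time.Sleep(time.Second)
	}
	fmt.Println("etcd never became healthy")
}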
E1017 10:47:13.912026   54413 controller.go:184] StorageError: key not found, Code: 1, Key: /registry/masterleases/10.33.178.11, ResourceVersion: 0, AdditionalErrorMsg: 
{"header":{"cluster_id":"14841639068965178418","member_id":"10276657743932975437","revision":"2","raft_term":"2"}}+++ [1017 10:47:13] Running integration test cases
+++ [1017 10:47:18] Running tests without code coverage
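Everything below is go test -json (test2json) output: one JSON event per line with Time, Action, Package, an optional Test, and Output fields, so long stack traces arrive sliced into many Output chunks. A small filter like this sketch (a hypothetical helper, not part of the job) reads such a stream on stdin and lists just the failing tests:

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// event mirrors the test2json record shape seen in the log lines below.
type event struct {
	Action  string
	Package string
	Test    string
}

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // Output chunks can be very long lines
	for sc.Scan() {
		var e event
		if err := json.Unmarshal(sc.Bytes(), &e); err != nil {
			continue // shell banners like "+++ [...]" are interleaved with the JSON
		}
		if e.Action == "fail" && e.Test != "" {
			fmt.Printf("FAIL %s %s\n", e.Package, e.Test)
		}
	}
}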
{"Time":"2020-10-17T10:49:43.005699984Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/podlogs","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/podlogs\t7.827s\n"}
{"Time":"2020-10-17T10:49:53.826147822Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/flowcontrol","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/flowcontrol\t19.447s\n"}
{"Time":"2020-10-17T10:50:01.149264889Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"meout.go:228 +0xb2\\nnet/http.Error(0x7f1c4849c520, 0xc005b01ee0, 0xc00938cf00, 0x60, 0x1f4)\\n\\t/usr/local/go/src/net/http/server.go:2054 +0x1f6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.InternalError(0x7f1c4849c520, 0xc005b01ee0, 0xc0093dac00, 0x5238280, 0xc0093d1c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/errors.go:75 +0x11e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7f1c4849c520, 0xc005b01ee0, 0xc0093dac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:69 +0x497\\nnet/http.HandlerFunc.ServeHTTP(0xc005b32340, 0x7f1c4849c520, 0xc005b01ee0, 0xc0093dac00)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithMaxInFlightLimit.func2(0x7f1c4849c520, 0xc005b01ee0, 0xc0093dac"}
{"Time":"2020-10-17T10:50:01.149274643Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:175 +0x4cf\\nnet/http.HandlerFunc.ServeHTTP(0xc005b2b2c0, 0x7f1c4849c520, 0xc005b01ee0, 0xc0093dac00)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f1c4849c520, 0xc005b01ee0, 0xc0093dac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x2306\\nnet/http.HandlerFunc.ServeHTTP(0xc005b32380, 0x7f1c4849c520, 0xc005b01ee0, 0xc0093dac00)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f1c4849c520, 0xc005b01ee0, 0xc0093dab00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFun"}
{"Time":"2020-10-17T10:50:03.056328328Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"r/pkg/server/filters/timeout.go:228 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc019a4ec90, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:529 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc00c6d2190, 0xc018b78000, 0xc0, 0x8727e, 0x0, 0x0, 0xc01ed33d10)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:204 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc01c2fa178, 0x48e3000, 0xc01e216640, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000185090, 0x51d04a0, 0xc01e216640, 0x51bf220, 0xc00c6d2190, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/g"}
{"Time":"2020-10-17T10:50:03.056337437Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"o/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000185090, 0x51d04a0, 0xc01e216640, 0x51bf220, 0xc00c6d2190, 0x3ac8e3c, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc01e2166e0, 0x51d04a0, 0xc01e216640, 0x51bf220, 0xc00c6d2190, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc01e2166e0, 0x51d04a0, 0xc01e216640, 0x51bf220, 0xc00c6d2190, 0x52237c0, 0xc000185090)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/"}
{"Time":"2020-10-17T10:50:03.05634572Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x49a687a, 0x10, 0x7ff423757cf8, 0xc01e2166e0, 0x5224080, 0xc01d6f0a28, 0xc01e22e900, 0x1f4, 0x51d04a0, 0xc01e216640)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5227500, 0xc0114dadc0, 0x5227840, 0x74215f0, 0x4989a8c, 0x4, 0x4988338, 0x2, 0x5224080, 0xc01d6f0a28, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:253 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51c3580, 0xc0218dd960, 0x5227500, 0xc0114dadc0, 0x4989a8c, 0x4, 0x4988338, 0x2, 0x52"}
{"Time":"2020-10-17T10:50:03.056370862Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"1(0xc019a4ebd0, 0xc005b14070)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:407 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0119b1cb0, 0x7ff423760cc0, 0xc01d6f0a18, 0xc01e22e900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x499ff80, 0xe, 0xc0119b1cb0, 0xc0124ef500, 0x7ff423760cc0, 0xc01d6f0a18, 0xc01e22e900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x539\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/f"}
{"Time":"2020-10-17T10:50:03.056390348Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"kg/endpoints/filters/impersonation.go:50 +0x2306\\nnet/http.HandlerFunc.ServeHTTP(0xc00a66db00, 0x7ff423760cc0, 0xc01d6f0a18, 0xc01e22e900)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7ff423760cc0, 0xc01d6f0a18, 0xc01e22e800)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFunc.ServeHTTP(0xc00b8554a0, 0x7ff423760cc0, 0xc01d6f0a18, 0xc01e22e800)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc01d693f80, 0xc002ef8da0, 0x522aac0, 0xc01d6f0a18, 0xc01e22e800)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:113 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n"}
... skipping 5 lines ...
{"Time":"2020-10-17T10:50:10.245438764Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"netes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00d906db0, 0x1f7)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:529 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc00c7ed630, 0xc00c0dc600, 0xa3, 0x581, 0x0, 0x0, 0x8)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:204 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc00c0ac180, 0x4949900, 0xc00c9bcd20, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc00013a780, 0x5245540, 0xc00c9bcd20, 0x52342c0, 0xc00c7ed630, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/ser"}
{"Time":"2020-10-17T10:50:10.24544875Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"ializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc00013a780, 0x5245540, 0xc00c9bcd20, 0x52342c0, 0xc00c7ed630, 0x3b170c4, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc00c9bcdc0, 0x5245540, 0xc00c9bcd20, 0x52342c0, 0xc00c7ed630, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc00c9bcdc0, 0x5245540, 0xc00c9bcd20, 0x52342c0, 0xc00c7ed630, 0x52999e0, 0xc00013a780)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versi"}
{"Time":"2020-10-17T10:50:10.245456905Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"oning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a0e676, 0x10, 0x7f1c486a8078, 0xc00c9bcdc0, 0x529a2a0, 0xc005a6efa8, 0xc00d902a00, 0x1f7, 0x5245540, 0xc00c9bcd20)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x529d7a0, 0xc00cb36c00, 0x529dae0, 0x7569290, 0x0, 0x0, 0x49ef1c6, 0x2, 0x529a2a0, 0xc005a6efa8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:253 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5233ae0, 0xc00c9bcc80, 0x529d7a0, 0xc00cb36c00, 0x0, 0x0, 0x49ef1c6, 0x2, 0x529a2a0, 0xc005a6efa8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local"}
{"Time":"2020-10-17T10:50:10.245466807Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:272 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:89\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1.1()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:188 +0x259\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.RecordLongRunning(0xc00d902a00, 0xc00d912370, 0x49f98f1, 0x9, 0xc00ee26f78)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:350 +0x293\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1(0x529a2a0, 0xc005a6efa8, 0xc00d902a00)\\n\\t/home/prow/go/src/k8s.io/kubernet"}
{"Time":"2020-10-17T10:50:10.245475077Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"es/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:185 +0x472\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulConnectResource.func1(0xc00d906d20, 0xc006ad2770)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1215 +0x99\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00d906d20, 0xc006ad2770)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:407 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc00cb3aa20, 0x7f1c4849c520, 0xc005a6ef98, 0xc00d902a00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernet"}
{"Time":"2020-10-17T10:50:10.245495879Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"etes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:175 +0x4cf\\nnet/http.HandlerFunc.ServeHTTP(0xc00cb336e0, 0x7f1c4849c520, 0xc005a6ef98, 0xc00d902a00)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f1c4849c520, 0xc005a6ef98, 0xc00d902a00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x2306\\nnet/http.HandlerFunc.ServeHTTP(0xc00cb36600, 0x7f1c4849c520, 0xc005a6ef98, 0xc00d902a00)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f1c4849c520, 0xc005a6ef98, 0xc00d902900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFunc.ServeHTTP(0xc00c87ff40, 0x7f1c4849c520, 0xc005a6ef98, 0xc00d902900)\\n\\t/usr"}
{"Time":"2020-10-17T10:50:10.245508218Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc00c66db60, 0xc00cb35360, 0x52a0de0, 0xc005a6ef98, 0xc00d902900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:113 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:99 +0x1cc\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"no endpoints available for service \\\\\\\\\\\\\\\"a\\\\\\\\\\\\\\\"\\\\\\\",\\\\\\\"reason\\\\\\\":\\\\\\\"ServiceUnavailable\\\\\\\",\\\\\\\"code\\\\\\\":503}\\\\n\\\"\\n\"\n"}
{"Time":"2020-10-17T10:50:11.695537637Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apimachinery","Output":"ok  \tk8s.io/kubernetes/test/integration/apimachinery\t51.571s\n"}
{"Time":"2020-10-17T10:50:18.789533697Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00523a360, 0x1f7)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:529 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc00f9a8910, 0xc00c0dc600, 0xa3, 0x581, 0x0, 0x0, 0x8)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:204 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc01af22180, 0x4949900, 0xc01a9cdb80, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc00013a780, 0x5245540, 0xc01a9cdb80, 0x52342c0, 0xc00f9a8910, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/run"}
{"Time":"2020-10-17T10:50:18.789543012Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"time/serializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc00013a780, 0x5245540, 0xc01a9cdb80, 0x52342c0, 0xc00f9a8910, 0x3b170c4, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc01a9cdc20, 0x5245540, 0xc01a9cdb80, 0x52342c0, 0xc00f9a8910, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc01a9cdc20, 0x5245540, 0xc01a9cdb80, 0x52342c0, 0xc00f9a8910, 0x52999e0, 0xc00013a780)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioni"}
{"Time":"2020-10-17T10:50:18.789551922Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"ng/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a0e676, 0x10, 0x7f1c486a8078, 0xc01a9cdc20, 0x529a2a0, 0xc019fd0318, 0xc003c99400, 0x1f7, 0x5245540, 0xc01a9cdb80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x529d7a0, 0xc015a9ed00, 0x529dae0, 0x7569290, 0x0, 0x0, 0x49ef1c6, 0x2, 0x529a2a0, 0xc019fd0318, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:253 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5233ae0, 0xc01a9cdae0, 0x529d7a0, 0xc015a9ed00, 0x0, 0x0, 0x49ef1c6, 0x2, 0x529a2a0, 0xc019fd0318, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_outp"}
{"Time":"2020-10-17T10:50:18.789560825Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"ut/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:272 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:89\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1.1()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:188 +0x259\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.RecordLongRunning(0xc003c99400, 0xc01af81550, 0x49f98f1, 0x9, 0xc018994f78)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:350 +0x293\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1(0x529a2a0, 0xc019fd0318, 0xc003c99400)\\n\\t/home/prow/go/src/k8s.io/"}
{"Time":"2020-10-17T10:50:18.789569549Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:185 +0x472\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulConnectResource.func1(0xc00523a2d0, 0xc01a96f810)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1215 +0x99\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00523a2d0, 0xc01a96f810)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:407 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc015aa6630, 0x7f1c4849c520, 0xc019fd0308, 0xc003c99400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/"}
{"Time":"2020-10-17T10:50:18.789596281Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"o/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:175 +0x4cf\\nnet/http.HandlerFunc.ServeHTTP(0xc015aa83f0, 0x7f1c4849c520, 0xc019fd0308, 0xc003c99400)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f1c4849c520, 0xc019fd0308, 0xc003c99400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x2306\\nnet/http.HandlerFunc.ServeHTTP(0xc015a9e700, 0x7f1c4849c520, 0xc019fd0308, 0xc003c99400)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f1c4849c520, 0xc019fd0308, 0xc003c99300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFunc.ServeHTTP(0xc015a86910, 0x7f1c4849c520, 0xc019fd0308, 0xc003c99300)"}
{"Time":"2020-10-17T10:50:18.789608275Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc00fb2c540, 0xc015aa4720, 0x52a0de0, 0xc019fd0308, 0xc003c99300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:113 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:99 +0x1cc\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"no endpoints available for service \\\\\\\\\\\\\\\"a\\\\\\\\\\\\\\\"\\\\\\\",\\\\\\\"reason\\\\\\\":\\\\\\\"ServiceUnavailable\\\\\\\",\\\\\\\"code\\\\\\\":503}\\\\n\\\"\\n\"\n"}
{"Time":"2020-10-17T10:50:29.492085092Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/configmap","Output":"ok  \tk8s.io/kubernetes/test/integration/configmap\t4.240s\n"}
{"Time":"2020-10-17T10:50:31.198450072Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc025668c60, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:529 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0256319a0, 0xc0049ce000, 0xbb, 0xb05, 0x0, 0x0, 0x42663e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:204 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc024b9df00, 0x4949900, 0xc0256721e0, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc00013a780, 0x5245540, 0xc0256721e0, 0x52342c0, 0xc0256319a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachi"}
{"Time":"2020-10-17T10:50:31.198458736Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"nery/pkg/runtime/serializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc00013a780, 0x5245540, 0xc0256721e0, 0x52342c0, 0xc0256319a0, 0x3b170c4, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc025672280, 0x5245540, 0xc0256721e0, 0x52342c0, 0xc0256319a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc025672280, 0x5245540, 0xc0256721e0, 0x52342c0, 0xc0256319a0, 0x52999e0, 0xc00013a780)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/seriali"}
{"Time":"2020-10-17T10:50:31.198476392Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"zer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a0e676, 0x10, 0x7f1c486a8078, 0xc025672280, 0x529a2a0, 0xc024d19320, 0xc02566d000, 0x1f4, 0x5245540, 0xc0256721e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x529d7a0, 0xc0204db900, 0x529dae0, 0x7569290, 0x0, 0x0, 0x49ef1c6, 0x2, 0x529a2a0, 0xc024d19320, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:253 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x522dbe0, 0xc025662ae0, 0x529d7a0, 0xc0204db900, 0x0, 0x0, 0x49ef1c6, 0x2, 0x529a2a0, 0xc024d19320, ...)\\n\\t/home/prow/go/src/k8s.io/kube"}
{"Time":"2020-10-17T10:50:31.198486591Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"rnetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:272 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:89\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.DeleteResource.func1(0x529a2a0, 0xc024d19320, 0xc02566d000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/delete.go:95 +0x1a10\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulDeleteResource.func1(0xc025668bd0, 0xc0237c3420)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1179 +0x83\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc025668bd0, 0xc0237c3420)\\n\\t/home/prow/go/src/k"}
{"Time":"2020-10-17T10:50:31.19849474Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:407 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0204e9dd0, 0x7f1c4849c520, 0xc024d19310, 0xc02566d000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a07bdc, 0xe, 0xc0204e9dd0, 0xc0192e65b0, 0x7f1c4849c520, 0xc024d19310, 0xc02566d000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x539\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7f1c4849c520, 0xc02"}
... skipping 298 lines ...
I1017 10:58:13.769204  124702 tlsconfig.go:255] Shutting down DynamicServingCertificateController
I1017 10:58:13.769257  124702 secure_serving.go:241] Stopped listening on 127.0.0.1:39727
I1017 10:58:13.769315  124702 crd_finalizer.go:278] Shutting down CRDFinalizer
    --- SKIP: TestLimits/ApplyPatchType_nested_patch_under_limit (0.00s)


=== Failed
=== FAIL: test/integration/apiserver/apply TestApplyCRDNonStructuralSchema (11.42s)
E1017 10:49:59.827160  116821 controller.go:184] Get "https://127.0.0.1:41061/api/v1/namespaces/default/endpoints/kubernetes": dial tcp 127.0.0.1:41061: connect: connection refused
    testserver.go:312: Resolved testserver package path to: "/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kube-apiserver/app/testing"
I1017 10:50:01.271263  116821 serving.go:325] Generated self-signed cert (/tmp/kubernetes-kube-apiserver405427394/apiserver.crt, /tmp/kubernetes-kube-apiserver405427394/apiserver.key)
I1017 10:50:01.271375  116821 server.go:625] external host was not specified, using 127.0.0.1
W1017 10:50:01.271422  116821 authentication.go:504] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
    testserver.go:183: runtime-config=map[api/all:true]
... skipping 20 lines ...
I1017 10:50:02.256898  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.256929  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:50:02.296983  116821 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I1017 10:50:02.298440  116821 instance.go:271] Using reconciler: lease
I1017 10:50:02.298781  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.298814  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.303507  116821 instance.go:376] Could not construct pre-rendered responses for ServiceAccountIssuerDiscovery endpoints. Endpoints will not be enabled. Error: empty issuer URL
I1017 10:50:02.303943  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.303979  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.305215  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.305251  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I1017 10:50:02.307156  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:02.307191  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
... skipping 199 lines ...
I1017 10:50:09.410210  116821 storage_scheduling.go:143] all system priority classes are created successfully or already exist.
W1017 10:50:09.503770  116821 lease.go:233] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E1017 10:50:09.505951  116821 controller.go:223] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
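The "unable to sync kubernetes service" error above is Endpoints validation at work: the test apiserver advertises only 127.0.0.1, and loopback addresses are rejected for Endpoints. A minimal sketch of the rule being applied; validateNonLoopback is a hypothetical stand-in, not the real validation function in k8s.io/kubernetes/pkg/apis/core/validation.

package main

import (
	"fmt"
	"net"
)

// validateNonLoopback rejects addresses in 127.0.0.0/8, mirroring the
// check that produced the error message in the log line above.
func validateNonLoopback(addr string) error {
	ip := net.ParseIP(addr)
	if ip == nil {
		return fmt.Errorf("Invalid value: %q: must be a valid IP address", addr)
	}
	if ip.IsLoopback() {
		return fmt.Errorf("Invalid value: %q: may not be in the loopback range (127.0.0.0/8)", addr)
	}
	return nil
}

func main() {
	fmt.Println(validateNonLoopback("127.0.0.1")) // rejected, as in the log above
	fmt.Println(validateNonLoopback("10.0.0.2"))  // <nil>: a routable address passes
}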
W1017 10:50:09.581128  116821 warnings.go:70] apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
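The v1beta1 deprecation warning is expected here: apiextensions.k8s.io/v1 accepts only structural schemas, so a test exercising a non-structural schema has to create its CRD through the deprecated v1beta1 API. For contrast, here is a hedged sketch of a v1-acceptable, structural CRD in the apiextensions Go types; the foos.example.com resource is made up for illustration.

package main

import (
	apiextensionsv1 "k8s.io/apiextensions-apiserver/pkg/apis/apiextensions/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func exampleStructuralCRD() *apiextensionsv1.CustomResourceDefinition {
	return &apiextensionsv1.CustomResourceDefinition{
		ObjectMeta: metav1.ObjectMeta{Name: "foos.example.com"},
		Spec: apiextensionsv1.CustomResourceDefinitionSpec{
			Group: "example.com",
			Names: apiextensionsv1.CustomResourceDefinitionNames{
				Plural: "foos", Singular: "foo", Kind: "Foo", ListKind: "FooList",
			},
			Scope: apiextensionsv1.NamespaceScoped,
			Versions: []apiextensionsv1.CustomResourceDefinitionVersion{{
				Name: "v1", Served: true, Storage: true,
				// v1 demands a structural schema: a typed root object with
				// typed properties, unlike the non-structural schemas this
				// test feeds through v1beta1.
				Schema: &apiextensionsv1.CustomResourceValidation{
					OpenAPIV3Schema: &apiextensionsv1.JSONSchemaProps{
						Type: "object",
						Properties: map[string]apiextensionsv1.JSONSchemaProps{
							"spec": {Type: "object"},
						},
					},
				},
			}},
		},
	}
}

func main() { _ = exampleStructuralCRD() }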
I1017 10:50:10.124938  116821 client.go:360] parsed scheme: "endpoint"
I1017 10:50:10.124983  116821 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W1017 10:50:10.226090  116821 watcher.go:220] watch chan error: etcdserver: mvcc: required revision has been compacted
W1017 10:50:11.227266  116821 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
    apply_crd_test.go:525: expected DELETE, but got watch.Event{Type:"", Object:runtime.Object(nil)}
W1017 10:50:11.227927  116821 cacher.go:148] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W1017 10:50:11.228438  116821 cacher.go:148] Terminating all watchers from cacher *core.LimitRange
W1017 10:50:11.228657  116821 cacher.go:148] Terminating all watchers from cacher *core.ResourceQuota
W1017 10:50:11.228854  116821 cacher.go:148] Terminating all watchers from cacher *core.Secret
... skipping 11 lines ...
W1017 10:50:11.239086  116821 cacher.go:148] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W1017 10:50:11.239726  116821 cacher.go:148] Terminating all watchers from cacher *apiregistration.APIService
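The one real failure in this run is the apply_crd_test.go:525 line above: the test expected a DELETE event on a watch and instead received the zero-value watch.Event, which is what a receive from a closed result channel yields; the surrounding "Terminating all watchers from cacher" lines show the server tearing watches down. A minimal sketch of that assertion pattern follows, where expectDelete and the fake watcher are illustrative, not the test's real code.

package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/watch"
)

// expectDelete waits for the next event and fails unless it is a DELETE.
// If the server tears the watch down, the result channel is closed and the
// receive yields the zero watch.Event: Type "" and a nil Object, exactly
// the value reported in the failure above.
func expectDelete(w watch.Interface) error {
	select {
	case ev, ok := <-w.ResultChan():
		if !ok || ev.Type != watch.Deleted {
			return fmt.Errorf("expected DELETE, but got %#v", ev)
		}
		return nil
	case <-time.After(30 * time.Second):
		return fmt.Errorf("timed out waiting for DELETE")
	}
}

func main() {
	// A FakeWatcher stands in for a real client-go watch; stopping it
	// closes the channel without ever delivering a Deleted event.
	fw := watch.NewFake()
	fw.Stop()
	fmt.Println(expectDelete(fw))
	// Prints: expected DELETE, but got watch.Event{Type:"", Object:runtime.Object(nil)}
}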


DONE 2862 tests, 25 skipped, 1 failure in 5.676s
+++ [1017 11:00:36] Saved JUnit XML test report to /logs/artifacts/junit_20201017-104718.xml
make[1]: *** [Makefile:185: test] Error 1
!!! [1017 11:00:36] Call tree:
!!! [1017 11:00:36]  1: hack/make-rules/test-integration.sh:97 runTests(...)
+++ [1017 11:00:36] Cleaning up etcd
+++ [1017 11:00:36] Integration test cleanup complete
make: *** [Makefile:204: test-integration] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
Cleaning up after docker
Stopping Docker: docker
Program process in pidfile '/var/run/docker-ssd.pid', 1 process(es), refused to die.
... skipping 3 lines ...