PR: cofyc: integration: Wait for PV to be available in TestVolumeProvision
Result: FAILURE
Tests: 1 failed / 2946 succeeded
Started: 2020-08-01 05:49
Elapsed: 58m31s
Revision: 930409c9933ce1af18dbc5c4b011d7ff5f29b00d
Refs: 93555
Resultstore: https://source.cloud.google.com/results/invocations/3f711e17-3a90-4f19-8349-7f4bf7d9fa91/targets/test

Test Failures


k8s.io/kubernetes/test/integration/master TestMain 0.00s

go test -v k8s.io/kubernetes/test/integration/master -run TestMain$
I0801 06:37:39.197202  117526 etcd.go:81] etcd already running at http://127.0.0.1:2379
FAIL	k8s.io/kubernetes/test/integration/master	611.101s

				from junit_20200801-062817.xml
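
To reproduce the failing package locally, the integration suite expects an etcd reachable on 127.0.0.1:2379, as the log line above shows. A minimal sketch, assuming a k8s.io/kubernetes checkout and the repo's etcd helper script:

    # Install the pinned etcd build and put it on PATH.
    ./hack/install-etcd.sh
    export PATH="${PWD}/third_party/etcd:${PATH}"
    # Re-run only the failing package; the timeout mirrors the ~611s seen above.
    go test -v -timeout 620s k8s.io/kubernetes/test/integration/master -run TestMain$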



2946 Passed Tests

24 Skipped Tests

Error lines from build-log.txt

... skipping 61 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [0801 06:01:22] Call tree:
!!! [0801 06:01:22]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0801 06:01:22]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0801 06:01:22]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [0801 06:01:22]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [0801 06:01:22]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
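
The canary above intentionally invokes a nonexistent command (bogus-expected-to-fail) to prove that the harness records failures into JUnit XML. A minimal sketch of that recording pattern, with hypothetical helper names (the real logic lives in third_party/forked/shell2junit/sh2ju.sh):

    # record_case NAME CMD... : run CMD and emit a passing or failing JUnit <testcase>.
    record_case() {
      local name=$1; shift
      if "$@"; then
        echo "<testcase name=\"${name}\"/>" >> junit.xml
      else
        echo "<testcase name=\"${name}\"><failure/></testcase>" >> junit.xml
      fi
    }
    record_case record_command_canary bogus-expected-to-fail  # expected to fail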
+++ [0801 06:01:22] Running kubeadm tests
+++ [0801 06:01:36] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [0801 06:03:05] Running tests without code coverage
{"Time":"2020-08-01T06:06:08.53101416Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t112.762s\n"}
✓  cmd/kubeadm/test/cmd (1m52.772s)
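
To run just this package outside CI, roughly (a sketch; the tests shell out to a built kubeadm binary):

    make WHAT=cmd/kubeadm
    go test -v k8s.io/kubernetes/cmd/kubeadm/test/cmd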
... skipping 323 lines ...
I0801 06:10:13.657574   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:10:13.657641   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:10:13.657657   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
+++ [0801 06:10:43] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I0801 06:10:44.301147   58383 serving.go:331] Generated self-signed cert in-memory
W0801 06:10:45.722761   58383 authentication.go:368] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0801 06:10:45.722812   58383 authentication.go:265] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0801 06:10:45.722822   58383 authentication.go:289] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0801 06:10:45.722843   58383 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0801 06:10:45.722868   58383 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0801 06:10:45.722895   58383 controllermanager.go:175] Version: v1.20.0-alpha.0.450+2a565fdf30426e
I0801 06:10:45.724804   58383 secure_serving.go:197] Serving securely on [::]:10257
I0801 06:10:45.724886   58383 tlsconfig.go:240] Starting DynamicServingCertificateController
I0801 06:10:45.725669   58383 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0801 06:10:45.725758   58383 leaderelection.go:243] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 7 lines ...
W0801 06:10:46.219834   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0801 06:10:46.219849   58383 controllermanager.go:549] Started "endpoint"
W0801 06:10:46.220225   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0801 06:10:46.220257   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0801 06:10:46.220300   58383 controllermanager.go:549] Started "deployment"
W0801 06:10:46.222367   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
E0801 06:10:46.222490   58383 core.go:90] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0801 06:10:46.222504   58383 controllermanager.go:541] Skipping "service"
W0801 06:10:46.223016   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0801 06:10:46.223070   58383 controllermanager.go:549] Started "endpointslicemirroring"
W0801 06:10:46.223438   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0801 06:10:46.223503   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0801 06:10:46.223565   58383 controllermanager.go:549] Started "daemonset"
... skipping 108 lines ...
I0801 06:10:46.635540   58383 controllermanager.go:549] Started "garbagecollector"
I0801 06:10:46.636232   58383 controllermanager.go:549] Started "persistentvolume-binder"
W0801 06:10:46.636251   58383 controllermanager.go:541] Skipping "root-ca-cert-publisher"
I0801 06:10:46.638997   58383 controllermanager.go:549] Started "cronjob"
W0801 06:10:46.639023   58383 controllermanager.go:528] "bootstrapsigner" is disabled
I0801 06:10:46.639298   58383 node_lifecycle_controller.go:77] Sending events to api server
E0801 06:10:46.639332   58383 core.go:230] failed to start cloud node lifecycle controller: no cloud provider provided
W0801 06:10:46.639341   58383 controllermanager.go:541] Skipping "cloud-node-lifecycle"
W0801 06:10:46.639618   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0801 06:10:46.639718   58383 controllermanager.go:549] Started "clusterrole-aggregation"
W0801 06:10:46.639734   58383 controllermanager.go:541] Skipping "ttl-after-finished"
W0801 06:10:46.639744   58383 controllermanager.go:541] Skipping "csrsigning"
W0801 06:10:46.639750   58383 controllermanager.go:541] Skipping "nodeipam"
... skipping 39 lines ...
I0801 06:10:46.884494   58383 shared_informer.go:247] Caches are synced for attach detach 
I0801 06:10:46.884513   58383 shared_informer.go:247] Caches are synced for disruption 
I0801 06:10:46.884523   58383 disruption.go:339] Sending events to api server.
I0801 06:10:46.884638   58383 shared_informer.go:247] Caches are synced for ReplicaSet 
I0801 06:10:46.884684   58383 shared_informer.go:247] Caches are synced for stateful set 
I0801 06:10:46.885892   58383 shared_informer.go:247] Caches are synced for PVC protection 
E0801 06:10:46.928456   58383 clusterroleaggregation_controller.go:181] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
E0801 06:10:46.931832   58383 clusterroleaggregation_controller.go:181] edit failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "edit": the object has been modified; please apply your changes to the latest version and try again
I0801 06:10:46.941716   58383 shared_informer.go:247] Caches are synced for namespace 
I0801 06:10:46.941978   58383 shared_informer.go:247] Caches are synced for service account 
I0801 06:10:46.944236   54838 controller.go:606] quota admission added evaluator for: serviceaccounts
I0801 06:10:47.084164   58383 shared_informer.go:247] Caches are synced for resource quota 
W0801 06:10:47.348724   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0801 06:10:47.348804   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
... skipping 3 lines ...
W0801 06:10:47.349507   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0801 06:10:47.350047   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0801 06:10:47.350194   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0801 06:10:47.350320   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0801 06:10:47.350767   58383 shared_informer.go:240] Waiting for caches to sync for garbage collector
node/127.0.0.1 created
W0801 06:10:47.432229   58383 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0801 06:10:47.441564   58383 shared_informer.go:247] Caches are synced for garbage collector 
I0801 06:10:47.441605   58383 garbagecollector.go:137] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
I0801 06:10:47.453759   58383 shared_informer.go:247] Caches are synced for garbage collector 
I0801 06:10:47.485513   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:10:47.485582   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:10:47.485596   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
... skipping 113 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0801 06:11:01] Creating namespace namespace-1596262261-26127
namespace/namespace-1596262261-26127 created
Context "test" modified.
+++ [0801 06:11:02] Testing RESTMapper
+++ [0801 06:11:02] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 59 lines ...
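
The RESTMapper check asserts that unknown resource types are rejected, and the table above is the start of the discovery listing. Exercising both by hand, assuming a kubeconfig pointed at the test server:

    kubectl api-resources            # prints the NAME/SHORTNAMES/... table shown above
    kubectl get unknownresourcetype  # error: the server doesn't have a resource type "unknownresourcetype"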
namespace/namespace-1596262277-13144 created
Context "test" modified.
+++ [0801 06:11:17] Testing clusterroles
rbac.sh:29: Successful get clusterroles/cluster-admin {{.metadata.name}}: cluster-admin
rbac.sh:30: Successful get clusterrolebindings/cluster-admin {{.metadata.name}}: cluster-admin
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created (dry run)
clusterrole.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created
rbac.sh:42: Successful get clusterrole/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
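
The pod-admin steps correspond to kubectl create clusterrole invocations roughly like the following (a sketch; rbac.sh drives them through its own wrappers):

    kubectl create clusterrole pod-admin --verb='*' --resource=pods --dry-run=client -o yaml
    kubectl create clusterrole pod-admin --verb='*' --resource=pods
    # rbac.sh:42 reads the verbs back with a go-template:
    kubectl get clusterrole/pod-admin -o go-template='{{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}'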
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "pod-admin" deleted
... skipping 21 lines ...
clusterrole.rbac.authorization.k8s.io/url-reader created
rbac.sh:61: Successful get clusterrole/url-reader {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: get:
rbac.sh:62: Successful get clusterrole/url-reader {{range.rules}}{{range.nonResourceURLs}}{{.}}:{{end}}{{end}}: /logs/*:/healthz/*:
clusterrole.rbac.authorization.k8s.io/aggregation-reader created
rbac.sh:64: Successful get clusterrole/aggregation-reader {{.metadata.name}}: aggregation-reader
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created
rbac.sh:77: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
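
A sketch of the super-admin binding commands exercised here; the "subjects updated" lines below come from kubectl set subject:

    kubectl create clusterrolebinding super-admin --clusterrole=admin --user=super-admin --dry-run=server
    kubectl create clusterrolebinding super-admin --clusterrole=admin --user=super-admin
    kubectl set subject clusterrolebinding super-admin --user=foo   # append a subject
    kubectl get clusterrolebinding/super-admin -o go-template='{{range.subjects}}{{.name}}:{{end}}'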
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (server dry run)
rbac.sh:80: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
... skipping 59 lines ...
rbac.sh:102: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:foo:test-all-user:
rbac.sh:103: Successful get clusterrolebinding/super-group {{range.subjects}}{{.name}}:{{end}}: the-group:foo:test-all-user:
rbac.sh:104: Successful get clusterrolebinding/super-sa {{range.subjects}}{{.name}}:{{end}}: sa-name:foo:test-all-user:
rolebinding.rbac.authorization.k8s.io/admin created (dry run)
rolebinding.rbac.authorization.k8s.io/admin created (server dry run)
Successful
message:Error from server (NotFound): rolebindings.rbac.authorization.k8s.io "admin" not found
has: not found
rolebinding.rbac.authorization.k8s.io/admin created
rbac.sh:113: Successful get rolebinding/admin {{.roleRef.kind}}: ClusterRole
rbac.sh:114: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:
rolebinding.rbac.authorization.k8s.io/admin subjects updated
rbac.sh:116: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:foo:
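
The namespaced variant follows the same shape; a sketch of the commands behind the admin rolebinding checks (roleRef.kind is ClusterRole because the binding is created with --clusterrole):

    kubectl create rolebinding admin --clusterrole=admin --user=default-admin
    kubectl set subject rolebinding admin --user=foo   # yields the default-admin:foo: subject list above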
... skipping 29 lines ...
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1596262304-28392 namespace.
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1596262304-28392 namespace.
Error: 1 warning received
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1596262304-28392 namespace.
Error: 1 warning received
has:Error: 1 warning received
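
The deprecation checks request the old group-version explicitly; by hand that is the fully-qualified resource form (namespace taken from the log):

    kubectl get roles.v1beta1.rbac.authorization.k8s.io -n namespace-1596262304-28392
    # prints the "Role is deprecated in v1.17+ ..." warning, then "No resources found ..."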
role.rbac.authorization.k8s.io/pod-admin created (dry run)
role.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): roles.rbac.authorization.k8s.io "pod-admin" not found
has: not found
role.rbac.authorization.k8s.io/pod-admin created
rbac.sh:163: Successful get role/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
rbac.sh:164: Successful get role/pod-admin {{range.rules}}{{range.resources}}{{.}}:{{end}}{{end}}: pods:
rbac.sh:165: Successful get role/pod-admin {{range.rules}}{{range.apiGroups}}{{.}}:{{end}}{{end}}: :
Successful
... skipping 464 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:210: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:215: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
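
The deletion checks above exercise kubectl delete's guard rails; a sketch against the valid-pod fixture:

    kubectl delete pods                                    # error: needs a name, label selector, or --all
    kubectl delete pods -l name=valid-pod --all            # error: 'all' with a non-empty selector
    kubectl delete pod valid-pod --force --grace-period=0  # prints the "Immediate deletion" warning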
... skipping 22 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:259: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:265: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:269: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:275: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
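
The pdb steps map onto kubectl create pdb with either a minimum-available or maximum-unavailable budget (selector value illustrative):

    kubectl create pdb test-pdb-2 -n test-kubectl-describe-pod --selector=app=rails --min-available=50%
    kubectl create pdb test-pdb-3 -n test-kubectl-describe-pod --selector=app=rails --max-unavailable=2
    # Passing both guards fails, matching the error above:
    kubectl create pdb bad-pdb -n test-kubectl-describe-pod --selector=app=rails --min-available=1 --max-unavailable=1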
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 230 lines ...
core.sh:534: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.2:
Successful
message:kubectl-create kubectl-patch
has:kubectl-patch
pod/valid-pod patched
core.sh:554: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0801 06:14:01] "kubectl patch with resourceVersion 716" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
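
The conflict is provoked by embedding a stale resourceVersion in the patch body; schematically (values illustrative, container name from the fixture):

    kubectl patch pod valid-pod -p '{"metadata":{"resourceVersion":"716"},"spec":{"containers":[{"name":"kubernetes-serve-hostname","image":"nginx"}]}}'
    # Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": ...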
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:578: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:kubectl-create kubectl-patch kubectl-replace
has:kubectl-replace
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
W0801 06:14:05.266037   58383 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test created
core.sh:606: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced (server dry run)
I0801 06:14:06.857985   58383 event.go:291] "Event occurred" object="node-v1-test" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node node-v1-test event: Registered Node node-v1-test in Controller"
node/node-v1-test replaced (dry run)
core.sh:631: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
... skipping 32 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:683: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:687: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:691: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:695: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:699: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
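
The label steps show the overwrite guard; a sketch:

    kubectl label pod valid-pod name=valid-pod-super-sayan              # fails: 'name' already has a value
    kubectl label pod valid-pod name=valid-pod-super-sayan --overwrite  # succeeds, per core.sh:695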
... skipping 86 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0801 06:14:45] Creating namespace namespace-1596262485-20167
namespace/namespace-1596262485-20167 created
Context "test" modified.
+++ [0801 06:14:45] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 42 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0801 06:14:46] "kubectl create with empty string list" returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
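
Both failure modes above are reachable by hand (testdata path from the log):

    kubectl create                                                   # Error: must specify one of -f and -k
    kubectl create -f hack/testdata/invalid-rc-with-empty-args.yaml  # client-side validation error, as shown
    # --validate=false would skip the client-side schema check entirely.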
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 36 lines ...
deployment.apps "test-deployment-retainkeys" deleted
Waiting for Get pods {{range.items}}{{.metadata.name}}:{{end}} : expected: , got: test-deployment-retainkeys-8695b756f8-qp2w2:
apply.sh:88: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
apply.sh:92: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
apply.sh:101: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W0801 06:15:22.049301   66610 helpers.go:567] --dry-run=true is deprecated (boolean value) and can be replaced with --dry-run=client.
pod/test-pod created (dry run)
pod/test-pod created (dry run)
... skipping 11 lines ...
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
I0801 06:15:38.847872   54838 client.go:360] parsed scheme: "endpoint"
I0801 06:15:38.847921   54838 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0801 06:15:38.859212   54838 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj created (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
namespace/nsb created
apply.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0801 06:15:40.432642   54838 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:15:40.432719   54838 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
pod/a created
apply.sh:161: Successful get pods a -n nsb {{.metadata.name}}: a
pod/b created
pod/a pruned
Warning: extensions/v1beta1 Ingress is deprecated in v1.14+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
apply.sh:165: Successful get pods b -n nsb {{.metadata.name}}: b
Successful
message:Error from server (NotFound): pods "a" not found
has:pods "a" not found
pod "b" deleted
apply.sh:175: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/a created
apply.sh:180: Successful get pods a {{.metadata.name}}: a
Successful
message:Error from server (NotFound): pods "b" not found
has:pods "b" not found
pod/b created
apply.sh:188: Successful get pods a {{.metadata.name}}: a
apply.sh:189: Successful get pods b -n nsb {{.metadata.name}}: b
(Bpod "a" deleted
pod "b" deleted
Successful
message:error: all resources selected for prune without explicitly passing --all. To prune all resources, pass the --all flag. If you did not mean to prune all resources, specify a label selector
has:all resources selected for prune without explicitly passing --all
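
The prune sequence applies manifests under a selector and deletes previously-applied objects missing from the new input; roughly (a sketch with illustrative file names):

    kubectl apply --prune -l name=test -f only-a.yaml   # creates pod/a
    kubectl apply --prune -l name=test -f only-b.yaml   # creates pod/b, prunes pod/a
    kubectl apply --prune -f only-b.yaml                # error above: needs -l <selector> or --all
    kubectl apply --prune --all -f only-b.yaml          # prune across everything previously applied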
pod/a created
pod/b created
service/prune-svc created
Warning: extensions/v1beta1 Ingress is deprecated in v1.14+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
I0801 06:15:51.058413   54838 client.go:360] parsed scheme: "passthrough"
... skipping 43 lines ...
pod/b created
apply.sh:242: Successful get pods b -n nsb {{.metadata.name}}: b
pod/b unchanged
pod/a pruned
Warning: extensions/v1beta1 Ingress is deprecated in v1.14+, unavailable in v1.22+; use networking.k8s.io/v1 Ingress
Successful
message:Error from server (NotFound): pods "a" not found
has:pods "a" not found
apply.sh:249: Successful get pods b -n nsb {{.metadata.name}}: b
namespace "nsb" deleted
Successful
message:error: the namespace from the provided object "nsb" does not match the namespace "foo". You must pass '--namespace=nsb' to perform this operation.
has:the namespace from the provided object "nsb" does not match the namespace "foo".
apply.sh:260: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
service/a created
apply.sh:264: Successful get services a {{.metadata.name}}: a
Successful
message:The Service "a" is invalid: spec.clusterIP: Invalid value: "10.0.0.12": field is immutable
... skipping 29 lines ...
apply.sh:286: Successful get deployment test-the-deployment {{.metadata.name}}: test-the-deployment
apply.sh:287: Successful get service test-the-service {{.metadata.name}}: test-the-service
configmap "test-the-map" deleted
service "test-the-service" deleted
deployment.apps "test-the-deployment" deleted
Successful
message:Error from server (NotFound): namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
apply.sh:295: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:namespace/multi-resource-ns created
Error from server (NotFound): error when creating "hack/testdata/multi-resource-1.yaml": namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
Successful
message:Error from server (NotFound): pods "test-pod" not found
has:pods "test-pod" not found
pod/test-pod created
namespace/multi-resource-ns unchanged
apply.sh:303: Successful get pods test-pod -n multi-resource-ns {{.metadata.name}}: test-pod
(Bpod "test-pod" deleted
namespace "multi-resource-ns" deleted
apply.sh:309: Successful get configmaps {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:configmap/foo created
error: unable to recognize "hack/testdata/multi-resource-2.yaml": no matches for kind "Bogus" in version "example.com/v1"
has:no matches for kind "Bogus" in version "example.com/v1"
apply.sh:315: Successful get configmaps foo {{.metadata.name}}: foo
configmap "foo" deleted
apply.sh:321: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:pod/pod-a created
... skipping 7 lines ...
I0801 06:16:45.949216   58383 namespace_controller.go:185] Namespace has been deleted multi-resource-ns
apply.sh:329: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apply.sh:333: Successful get crds {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/widgets.example.com created
error: unable to recognize "hack/testdata/multi-resource-4.yaml": no matches for kind "Widget" in version "example.com/v1"
has:no matches for kind "Widget" in version "example.com/v1"
I0801 06:16:48.926593   54838 client.go:360] parsed scheme: "endpoint"
I0801 06:16:48.926656   54838 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
Successful
message:Error from server (NotFound): widgets.example.com "foo" not found
has:widgets.example.com "foo" not found
apply.sh:339: Successful get crds widgets.example.com {{.metadata.name}}: widgets.example.com
I0801 06:16:50.598952   54838 controller.go:606] quota admission added evaluator for: widgets.example.com
widget.example.com/foo created
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/widgets.example.com unchanged
apply.sh:342: Successful get widget foo {{.metadata.name}}: foo
widget.example.com "foo" deleted
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io "widgets.example.com" deleted
+++ exit code: 0
W0801 06:16:52.142865   58383 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
E0801 06:16:52.146793   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0801 06:16:52.147700   58383 shared_informer.go:240] Waiting for caches to sync for garbage collector
Recording: run_kubectl_server_side_apply_tests
Running command: run_kubectl_server_side_apply_tests

+++ Running case: test-cmd.run_kubectl_server_side_apply_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_server_side_apply_tests
+++ [0801 06:16:52] Creating namespace namespace-1596262612-14126
namespace/namespace-1596262612-14126 created
I0801 06:16:52.829806   54838 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:16:52.829875   54838 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
Context "test" modified.
+++ [0801 06:16:53] Testing kubectl apply --server-side
E0801 06:16:53.495603   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apply.sh:359: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/test-pod serverside-applied
apply.sh:363: Successful get pods test-pod {{.metadata.labels.name}}: test-pod-label
E0801 06:16:56.019631   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:kubectl
has:kubectl
pod/test-pod serverside-applied
Successful
message:kubectl my-field-manager
has:my-field-manager
pod "test-pod" deleted
apply.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0801 06:17:00.435041   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:17:00.435186   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:17:00.435229   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
pod/test-pod serverside-applied (server dry run)
E0801 06:17:00.741428   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apply.sh:381: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/test-pod serverside-applied
pod/test-pod serverside-applied (server dry run)
apply.sh:388: Successful get pods test-pod {{.metadata.labels.name}}: test-pod-label
Successful
message:1146
has:1146
pod "test-pod" deleted
apply.sh:398: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ [0801 06:17:08] Testing upgrade kubectl client-side apply to server-side apply
pod/test-pod created
error: Apply failed with 1 conflict: conflict with "kubectl-client-side-apply" using v1: .metadata.labels.name
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the `--force-conflicts` flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
  current managers.
* You may co-own fields by updating your manifest to match the existing
  value; in this case, you'll become the manager if the other manager(s)
  stop managing the field (remove it from their configuration).
See http://k8s.io/docs/reference/using-api/api-concepts/#conflicts
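
This is the expected hand-off friction when a field last written by client-side apply is re-applied server-side; the usual resolutions (a sketch, manifest name illustrative; --field-manager matches the my-field-manager check earlier):

    kubectl apply --server-side -f test-pod.yaml                     # conflicts with "kubectl-client-side-apply"
    kubectl apply --server-side --force-conflicts -f test-pod.yaml   # take ownership of the conflicting fields
    kubectl apply --server-side --field-manager=my-field-manager -f test-pod.yaml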
E0801 06:17:12.789686   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod serverside-applied
Successful
message:{
  "apiVersion": "v1",
  "kind": "Pod",
  "metadata": {
... skipping 50 lines ...
I0801 06:17:23.061486   58383 shared_informer.go:240] Waiting for caches to sync for garbage collector
W0801 06:17:23.101069   58383 shared_informer.go:494] resyncPeriod 62681057954911 is smaller than resyncCheckPeriod 77002239520113 and the informer has already started. Changing it to 77002239520113
I0801 06:17:23.101139   58383 resource_quota_monitor.go:228] QuotaMonitor created object count evaluator for resources.mygroup.example.com
I0801 06:17:23.101202   58383 shared_informer.go:240] Waiting for caches to sync for resource quota
I0801 06:17:23.101231   58383 shared_informer.go:247] Caches are synced for resource quota 
I0801 06:17:23.161732   58383 shared_informer.go:247] Caches are synced for garbage collector 
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_run_tests
+++ [0801 06:17:24] Creating namespace namespace-1596262644-22372
W0801 06:17:24.785655   54838 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
I0801 06:17:24.786345   54838 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:17:24.786425   54838 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
E0801 06:17:24.788243   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1596262644-22372 created
Context "test" modified.
+++ [0801 06:17:25] Testing kubectl run
pod/nginx-extensions created (dry run)
E0801 06:17:26.370677   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/nginx-extensions created (server dry run)
run.sh:32: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0801 06:17:28.469846   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
run.sh:35: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/nginx-extensions created
run.sh:39: Successful get pod {{range.items}}{{.metadata.name}}:{{end}}: nginx-extensions:
pod "nginx-extensions" deleted
Successful
message:pod/test1 created
has:pod/test1 created
pod "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
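
The run checks cover both dry-run flavors and image-name validation; a sketch (image names illustrative):

    kubectl run nginx-extensions --image=nginx --dry-run=client
    kubectl run nginx-extensions --image=nginx --dry-run=server
    kubectl run nginx-extensions --image=nginx
    kubectl run test1 --image=InvalidImageName   # error: Invalid image name ... invalid reference format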
+++ exit code: 0
Recording: run_kubectl_create_filter_tests
Running command: run_kubectl_create_filter_tests

+++ Running case: test-cmd.run_kubectl_create_filter_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 2 lines ...
namespace/namespace-1596262650-9846 created
Context "test" modified.
+++ [0801 06:17:31] Testing kubectl create filter
create.sh:50: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:54: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
E0801 06:17:34.205487   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

+++ Running case: test-cmd.run_kubectl_apply_deployments_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_apply_deployments_tests
+++ [0801 06:17:35] Creating namespace namespace-1596262655-24677
E0801 06:17:36.050036   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1596262655-24677 created
Context "test" modified.
+++ [0801 06:17:36] Testing kubectl apply deployments
apps.sh:119: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:120: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:121: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 8 lines ...
apps.sh:134: Successful get deployments my-depl {{.spec.template.metadata.labels.l1}}: l1
apps.sh:135: Successful get deployments my-depl {{.spec.selector.matchLabels.l1}}: l1
apps.sh:136: Successful get deployments my-depl {{.metadata.labels.l1}}: <no value>
deployment.apps "my-depl" deleted
replicaset.apps "my-depl-84fb47b469" deleted
pod "my-depl-84fb47b469-ztf24" deleted
E0801 06:17:43.186858   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:142: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:143: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
I0801 06:17:44.788475   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:17:44.788682   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:17:44.788697   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
apps.sh:144: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 2 lines ...
I0801 06:17:46.431183   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-9bb9c4878 to 3"
I0801 06:17:46.443333   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-nz5dv"
I0801 06:17:46.480385   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-s8wbz"
I0801 06:17:46.480666   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-shfl6"
apps.sh:152: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1596262655-24677\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1596262655-24677"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
I0801 06:17:53.453745   58383 shared_informer.go:240] Waiting for caches to sync for resource quota
I0801 06:17:53.453802   58383 shared_informer.go:247] Caches are synced for resource quota 
I0801 06:17:53.514094   58383 shared_informer.go:240] Waiting for caches to sync for garbage collector
I0801 06:17:53.514169   58383 shared_informer.go:247] Caches are synced for garbage collector 
E0801 06:17:57.443491   58383 replica_set.go:532] sync "namespace-1596262655-24677/nginx-9bb9c4878" failed with Operation cannot be fulfilled on replicasets.apps "nginx-9bb9c4878": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1596262655-24677/nginx-9bb9c4878, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 957a1574-6fd7-415b-b9fb-29fcac97003c, UID in object meta: 
E0801 06:17:57.801942   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx configured
I0801 06:17:58.042016   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-6dd6cfdb57 to 3"
I0801 06:17:58.047682   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-s2tnh"
I0801 06:17:58.052741   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-d9xm7"
I0801 06:17:58.059896   58383 event.go:291] "Event occurred" object="namespace-1596262655-24677/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-wlcbj"
Successful
... skipping 223 lines ...
+status:
+  phase: Pending
+  qosClass: BestEffort
has:test-pod
diff.sh:76: Successful get pod {{range.items}}{{ if eq .metadata.name \"test-pod\" }}found{{end}}{{end}}:: :
pod/test-pod serverside-applied
E0801 06:18:15.010505   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
diff.sh:80: Successful get pod {{range.items}}{{ if eq .metadata.name \"test-pod\" }}found{{end}}{{end}}:: found:
Successful
message:diff -u -N /tmp/LIVE-573865539/v1.Pod.namespace-1596262685-32007.test-pod /tmp/MERGED-147210438/v1.Pod.namespace-1596262685-32007.test-pod
--- /tmp/LIVE-573865539/v1.Pod.namespace-1596262685-32007.test-pod	2020-08-01 06:18:16.422860968 +0000
+++ /tmp/MERGED-147210438/v1.Pod.namespace-1596262685-32007.test-pod	2020-08-01 06:18:16.430861514 +0000
@@ -19,7 +19,7 @@
... skipping 88 lines ...
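
kubectl diff renders live-versus-merged objects as a unified diff, as in the excerpt above, and exits non-zero when differences exist; a sketch (manifest name illustrative):

    kubectl diff -f test-pod.yaml                                      # exit code 1 when the live object differs
    KUBECTL_EXTERNAL_DIFF="diff -u -N" kubectl diff -f test-pod.yaml   # override the diff program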
+++ [0801 06:18:20] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0801 06:18:21.413883   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:18:21.413979   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:18:21.413995   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1596262700-11057 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1596262700-11057 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0801 06:18:25.871504   70013 loader.go:375] Config loaded from file:  /tmp/tmp.t1QqiqJJWV/.kube/config
I0801 06:18:25.874405   70013 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I0801 06:18:25.934802   70013 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
I0801 06:18:25.938214   70013 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 2 milliseconds
... skipping 637 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2020-08-01T06:18:37Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl-create", "operation":"Update", "time":"2020-08-01T06:18:37Z"}}, "name":"valid-pod", "namespace":"namespace-1596262716-7769", "resourceVersion":"1403", "selfLink":"/api/v1/namespaces/namespace-1596262716-7769/pods/valid-pod", "uid":"ce013235-57a2-48f2-89f6-73dbdaba74af"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "preemptionPolicy":"PreemptLowerPriority", "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2020-08-01T06:18:37Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl-create","operation":"Update","time":"2020-08-01T06:18:37Z"}],"name":"valid-pod","namespace":"namespace-1596262716-7769","resourceVersion":"1403","selfLink":"/api/v1/namespaces/namespace-1596262716-7769/pods/valid-pod","uid":"ce013235-57a2-48f2-89f6-73dbdaba74af"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"preemptionPolicy":"PreemptLowerPriority","priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2020-08-01T06:18:37Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl-create operation:Update time:2020-08-01T06:18:37Z]] name:valid-pod namespace:namespace-1596262716-7769 resourceVersion:1403 selfLink:/api/v1/namespaces/namespace-1596262716-7769/pods/valid-pod uid:ce013235-57a2-48f2-89f6-73dbdaba74af] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true preemptionPolicy:PreemptLowerPriority priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
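
The two failures probe a missing key through both output engines; by hand (a sketch):

    kubectl get pod valid-pod -o jsonpath='{.missing}'        # error: missing is not found
    kubectl get pod valid-pod -o go-template='{{.missing}}'   # error: map has no entry for key "missing"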
... skipping 3 lines ...
valid-pod   0/1     Pending   0          1s
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
E0801 06:18:41.077329   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:pod/valid-pod
has not:STATUS
Successful
message:pod/valid-pod
has:pod/valid-pod
... skipping 143 lines ...
  terminationGracePeriodSeconds: 30
status:
  phase: Pending
  qosClass: Guaranteed
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 22 lines ...
has:test-the-service
configmap "test-the-map" deleted
service "test-the-service" deleted
deployment.apps "test-the-deployment" deleted
get.sh:227: Successful get configmaps {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0801 06:18:45.906716   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:229: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
Recording: run_kubectl_exec_pod_tests
Running command: run_kubectl_exec_pod_tests

+++ Running case: test-cmd.run_kubectl_exec_pod_tests 
... skipping 2 lines ...
+++ [0801 06:18:46] Creating namespace namespace-1596262726-2958
namespace/namespace-1596262726-2958 created
Context "test" modified.
+++ [0801 06:18:46] Testing kubectl exec POD COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
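Sketch (not part of the test run): the deprecation warnings above come from passing COMMAND as a bare trailing argument to kubectl exec. Reusing the test-pod name from this case, and assuming the pod is scheduled so exec can reach a kubelet:

    kubectl exec test-pod date        # deprecated form, still accepted here
    kubectl exec test-pod -- date     # preferred form: '--' separates the command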
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 3 lines ...
+++ [0801 06:18:47] Creating namespace namespace-1596262727-10920
namespace/namespace-1596262727-10920 created
Context "test" modified.
+++ [0801 06:18:47] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: the server doesn't have a resource type "foo"
has:error:
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0801 06:18:49.417144   58383 event.go:291] "Event occurred" object="namespace-1596262727-10920/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-bfl7z"
I0801 06:18:49.421063   58383 event.go:291] "Event occurred" object="namespace-1596262727-10920/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-ntjkf"
I0801 06:18:49.422254   58383 event.go:291] "Event occurred" object="namespace-1596262727-10920/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-dpngf"
configmap/test-set-env-config created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod, type/name or --filename must be specified
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-bfl7z does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-bfl7z does not have a host assigned
has not:pod, type/name or --filename must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
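Sketch (not part of the test run): kubectl exec also accepts TYPE/NAME and picks a pod through the resource's selector, which is why the ConfigMap attempt above fails with "selector for *v1.ConfigMap not implemented". With the objects from this case:

    kubectl exec replicaset.apps/frontend -- date        # resolves to one pod behind the ReplicaSet
    kubectl exec configmap/test-set-env-config -- date   # fails: a ConfigMap has no selector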
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"4c0610d6-4f15-4247-92f0-fe4bd472f46f","resourceVersion":"1490","creationTimestamp":"2020-08-01T06:18:51Z"}}
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"4c0610d6-4f15-4247-92f0-fe4bd472f46f","resourceVersion":"1491","creationTimestamp":"2020-08-01T06:18:51Z"},"data":{"key1":"config1"}}
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"4c0610d6-4f15-4247-92f0-fe4bd472f46f","resourceVersion":"1491","creationTimestamp":"2020-08-01T06:18:51Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"4c0610d6-4f15-4247-92f0-fe4bd472f46f"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
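Sketch (not part of the test run): the secret assertions above use ordinary create/get, while the ConfigMap JSON bodies are consistent with kubectl's raw REST mode. Assuming a reachable cluster:

    kubectl create secret generic mysecret --from-literal=key1=user-specified
    kubectl get --raw /api/v1/namespaces/default/configmaps/tester-update-cm   # prints JSON like the bodies above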
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 176 lines ...
has:Timeout
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          2s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
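Sketch (not part of the test run): the timeout assertions above exercise --request-timeout, which takes a bare integer (seconds) or an integer with a time unit:

    kubectl get pods --request-timeout=1         # one second
    kubectl get pods --request-timeout=2m        # two minutes
    kubectl get pods --request-timeout=invalid   # rejected: Invalid timeout value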
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 244 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0801 06:19:16] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 365 lines ...
bar.company.com/test created
foo.company.com/test pruned
crd.sh:442: Successful get foos {{range.items}}{{.metadata.name}}:{{end}}: 
crd.sh:443: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: test:
bar.company.com "test" deleted
crd.sh:449: Successful get foos {{range.items}}{{.metadata.name}}:{{end}}: 
E0801 06:19:44.858411   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
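Sketch (not part of the test run): as the crd.sh assertions above show, custom resources do not support client-side strategic merge patches, so the tests fall back to --type merge. Against the foos.company.com resource from this run:

    kubectl patch foos/test --type=merge -p '{"patched":"value1"}'   # sets the field
    kubectl patch foos/test --type=merge -p '{"patched":null}'       # clears it, the "<no value>" case above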
W0801 06:19:53.173233   54838 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
I0801 06:19:53.173574   54838 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:19:53.173612   54838 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
E0801 06:19:53.174742   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0801 06:19:53] Testing recursive resources
+++ [0801 06:19:53] Creating namespace namespace-1596262793-31662
W0801 06:19:53.406950   54838 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
I0801 06:19:53.406957   54838 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:19:53.407073   54838 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
E0801 06:19:53.408614   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1596262793-31662 created
W0801 06:19:53.685443   54838 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
I0801 06:19:53.685806   54838 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:19:53.685860   54838 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
E0801 06:19:53.686959   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
W0801 06:19:53.967431   54838 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
I0801 06:19:53.967453   54838 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
E0801 06:19:53.969343   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0801 06:19:54.349893   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0801 06:19:54.608822   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0801 06:19:54.802681   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
I0801 06:19:55.062057   58383 shared_informer.go:240] Waiting for caches to sync for resource quota
I0801 06:19:55.062108   58383 shared_informer.go:247] Caches are synced for resource quota 
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0801 06:19:55.479968   58383 shared_informer.go:240] Waiting for caches to sync for garbage collector
I0801 06:19:55.480040   58383 shared_informer.go:247] Caches are synced for garbage collector 
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0801 06:19:56.398774   58383 namespace_controller.go:185] Namespace has been deleted non-native-resources
E0801 06:19:56.440895   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0801 06:19:56.702900   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0801 06:19:57.130563   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:Name:         busybox0
Namespace:    namespace-1596262793-31662
Priority:     0
Node:         <none>
... skipping 159 lines ...
has:Object 'Kind' is missing
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
pod/busybox0 configured
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
pod/busybox1 configured
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0801 06:20:00.750168   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-54785cbcb8 to 3"
I0801 06:20:00.756959   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-dm99d"
I0801 06:20:00.764517   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-4qlpz"
I0801 06:20:00.771158   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-gqvl6"
E0801 06:20:00.841303   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E0801 06:20:01.303175   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
Successful
message:apiVersion: extensions/v1beta1
... skipping 37 lines ...
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:extensions/v1beta1
deployment.apps "nginx" deleted
Waiting for Get pods {{range.items}}{{.metadata.name}}:{{end}} : expected: busybox0:busybox1:, got: busybox0:busybox1:nginx-54785cbcb8-4qlpz:nginx-54785cbcb8-dm99d:nginx-54785cbcb8-gqvl6:
E0801 06:20:02.558145   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
I0801 06:20:05.952794   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:20:05.952863   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:20:05.952878   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0801 06:20:06.997119   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-252lc"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0801 06:20:07.008758   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-ztk25"
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0801 06:20:07.851052   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
generic-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
E0801 06:20:09.604640   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I0801 06:20:12.360233   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-59w8s"
I0801 06:20:12.386532   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-4xdbb"
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0801 06:20:13.343205   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx1-deployment created
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0801 06:20:14.647281   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx1-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx1-deployment-758b5949b6 to 2"
I0801 06:20:14.653113   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-gbjzv"
I0801 06:20:14.656634   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx0-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx0-deployment-75db9cdfd9 to 2"
I0801 06:20:14.660118   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-5lhjj"
I0801 06:20:14.665209   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-79dm9"
I0801 06:20:14.671343   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-lfhkv"
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0801 06:20:19.511095   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-sqvgp"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0801 06:20:19.525968   58383 event.go:291] "Event occurred" object="namespace-1596262793-31662/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-s98p8"
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
... skipping 2 lines ...
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0801 06:20:22] Testing kubectl(v1:namespaces)
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created (dry run)
E0801 06:20:23.014853   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0801 06:20:23.391568   58383 horizontal.go:354] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1596262793-31662
I0801 06:20:23.397418   58383 horizontal.go:354] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1596262793-31662
namespace/my-namespace created (server dry run)
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1459: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
(Bnamespace "my-namespace" deleted
E0801 06:20:28.430077   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
E0801 06:20:30.747036   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1468: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1596262252-8014" deleted
... skipping 29 lines ...
namespace "namespace-1596262735-32446" deleted
namespace "namespace-1596262735-6186" deleted
namespace "namespace-1596262738-8867" deleted
namespace "namespace-1596262742-1763" deleted
namespace "namespace-1596262745-23646" deleted
namespace "namespace-1596262793-31662" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1596262252-8014" deleted
... skipping 29 lines ...
namespace "namespace-1596262735-32446" deleted
namespace "namespace-1596262735-6186" deleted
namespace "namespace-1596262738-8867" deleted
namespace "namespace-1596262742-1763" deleted
namespace "namespace-1596262745-23646" deleted
namespace "namespace-1596262793-31662" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
namespace/quotas created
core.sh:1475: Successful get namespaces/quotas {{.metadata.name}}: quotas
core.sh:1476: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
resourcequota/test-quota created (dry run)
resourcequota/test-quota created (server dry run)
E0801 06:20:33.255434   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1480: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
resourcequota/test-quota created
E0801 06:20:33.842168   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1483: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: found:
I0801 06:20:34.158448   58383 resource_quota_controller.go:306] Resource quota has been deleted quotas/test-quota
resourcequota "test-quota" deleted
namespace "quotas" deleted
core.sh:1495: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
I0801 06:20:40.014221   58383 namespace_controller.go:185] Namespace has been deleted my-namespace
... skipping 34 lines ...
I0801 06:20:42.427109   58383 namespace_controller.go:185] Namespace has been deleted namespace-1596262655-24677
I0801 06:20:42.430967   58383 namespace_controller.go:185] Namespace has been deleted namespace-1596262726-2958
I0801 06:20:42.481884   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:20:42.482107   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:20:42.482122   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
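Sketch (not part of the test run): the error above is kubectl refusing a get by name combined with --all-namespaces, since a bare name is only unique within a single namespace:

    kubectl get pods --all-namespaces            # fine: lists pods everywhere
    kubectl get pod valid-pod --all-namespaces   # rejected with the error quoted above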
I0801 06:20:42.625087   58383 namespace_controller.go:185] Namespace has been deleted namespace-1596262735-32446
I0801 06:20:42.651422   58383 namespace_controller.go:185] Namespace has been deleted namespace-1596262735-6186
I0801 06:20:42.678029   58383 namespace_controller.go:185] Namespace has been deleted namespace-1596262727-10920
I0801 06:20:42.680773   58383 namespace_controller.go:185] Namespace has been deleted namespace-1596262745-23646
I0801 06:20:42.691336   58383 namespace_controller.go:185] Namespace has been deleted namespace-1596262738-8867
... skipping 67 lines ...
core.sh:851: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
I0801 06:20:53.805664   58383 namespace_controller.go:185] Namespace has been deleted other
secret "test-secret" deleted
core.sh:861: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:865: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
E0801 06:20:55.173676   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:866: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
(Bsecret "test-secret" deleted
core.sh:876: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:879: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:880: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
... skipping 4 lines ...
(Bsecret "test-secret" deleted
secret/secret-string-data created
core.sh:909: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:910: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:911: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
secret "secret-string-data" deleted
E0801 06:21:00.401433   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:920: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
(Bsecret "test-secret" deleted
namespace "test-secrets" deleted
+++ exit code: 0
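Sketch (not part of the test run): the secret types asserted above map onto the create secret subcommands; the literal values and file paths here are placeholders:

    kubectl create secret generic test-secret --from-literal=key1=value1
    kubectl create secret docker-registry test-secret --docker-username=user --docker-password=pass --docker-email=user@example.com   # kubernetes.io/dockerconfigjson
    kubectl create secret tls test-secret --cert=tls.crt --key=tls.key   # kubernetes.io/tls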
Recording: run_configmap_tests
Running command: run_configmap_tests
... skipping 9 lines ...
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
(Bconfigmap "test-configmap" deleted
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
(Bnamespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
(Bcore.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
(BE0801 06:21:09.783971   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
(Bconfigmap/test-configmap created (dry run)
configmap/test-configmap created (server dry run)
core.sh:46: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
I0801 06:21:11.289548   58383 namespace_controller.go:185] Namespace has been deleted test-secrets
configmap/test-configmap created
... skipping 15 lines ...
+++ command: run_client_config_tests
+++ [0801 06:21:18] Creating namespace namespace-1596262878-29032
namespace/namespace-1596262878-29032 created
Context "test" modified.
+++ [0801 06:21:19] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
E0801 06:21:20.032027   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
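Sketch (not part of the test run): each failure above isolates one client-config lookup, first the kubeconfig file itself and then the context, cluster, and user entries inside it:

    kubectl get pods --kubeconfig=missing        # stat missing: no such file or directory
    kubectl get pods --context=missing-context   # context was not found
    kubectl get pods --cluster=missing-cluster   # no server found for cluster
    kubectl get pods --user=missing-user         # auth info does not exist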
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
... skipping 15 lines ...
core.sh:957: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
(Bserviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
I0801 06:21:31.350066   54838 trace.go:205] Trace[1551449506]: "Get" url:/api/v1/namespaces/kube-system/endpoints/kube-controller-manager,user-agent:kube-controller-manager/v1.20.0 (linux/amd64) kubernetes/2a565fd/leader-election,client:127.0.0.1 (01-Aug-2020 06:21:30.773) (total time: 576ms):
Trace[1551449506]: ---"About to write a response" 576ms (06:21:00.349)
Trace[1551449506]: [576.646322ms] [576.646322ms] END
E0801 06:21:31.974303   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0801 06:21:32.398700   54838 trace.go:205] Trace[955834881]: "Get" url:/api/v1/namespaces/test-service-accounts,user-agent:kube-controller-manager/v1.20.0 (linux/amd64) kubernetes/2a565fd/namespace-controller,client:127.0.0.1 (01-Aug-2020 06:21:31.702) (total time: 696ms):
Trace[955834881]: ---"About to write a response" 696ms (06:21:00.398)
Trace[955834881]: [696.603466ms] [696.603466ms] END
+++ exit code: 0
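Sketch (not part of the test run): the serviceaccount round-trip asserted above, assuming the test-service-accounts namespace exists:

    kubectl create serviceaccount test-service-account --namespace=test-service-accounts
    kubectl get serviceaccount/test-service-account --namespace=test-service-accounts -o go-template='{{.metadata.name}}'
    kubectl delete serviceaccount test-service-account --namespace=test-service-accounts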
Recording: run_job_tests
Running command: run_job_tests
... skipping 22 lines ...
Labels:                        <none>
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  <none>
... skipping 40 lines ...
Labels:         controller-uid=96f96961-a007-429e-bcb1-11d8faf41b82
                job-name=test-job
Annotations:    cronjob.kubernetes.io/instantiate: manual
Parallelism:    1
Completions:    1
Start Time:     Sat, 01 Aug 2020 06:21:40 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=96f96961-a007-429e-bcb1-11d8faf41b82
           job-name=test-job
  Containers:
   pi:
    Image:      k8s.gcr.io/perl
... skipping 14 lines ...
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  1s    job-controller  Created pod: test-job-wcgmn
job.batch "test-job" deleted
cronjob.batch "pi" deleted
namespace "test-jobs" deleted
E0801 06:21:46.820628   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_job_tests
+++ [0801 06:21:47] Creating namespace namespace-1596262907-30932
namespace/namespace-1596262907-30932 created
Context "test" modified.
I0801 06:21:48.095690   58383 event.go:291] "Event occurred" object="namespace-1596262907-30932/test-job" kind="Job" apiVersion="batch/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-job-jt2cs"
job.batch/test-job created
create.sh:106: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
E0801 06:21:48.505459   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job" deleted
job.batch/test-job-pi created
I0801 06:21:48.911627   58383 event.go:291] "Event occurred" object="namespace-1596262907-30932/test-job-pi" kind="Job" apiVersion="batch/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-job-pi-5d2f9"
create.sh:112: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
job.batch "test-job-pi" deleted
cronjob.batch/test-pi created
... skipping 14 lines ...
+++ [0801 06:21:51] Creating namespace namespace-1596262911-8468
namespace/namespace-1596262911-8468 created
Context "test" modified.
+++ [0801 06:21:51] Testing pod templates
core.sh:1581: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
I0801 06:21:52.024353   58383 namespace_controller.go:185] Namespace has been deleted test-jobs
E0801 06:21:52.371947   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0801 06:21:52.378700   54838 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
E0801 06:21:52.515764   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1585: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
core.sh:1593: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
core.sh:1597: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 399 lines ...
  type: ClusterIP
status:
  loadBalancer: {}
Successful
message:kubectl-create kubectl-set
has:kubectl-set
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1020: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
service/redis-master selector updated
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
core.sh:1033: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
(Bservice "redis-master" deleted
core.sh:1040: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
(Bcore.sh:1044: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
(Bservice/redis-master created
... skipping 20 lines ...
(Bservice "redis-master" deleted
service "redis-slave" deleted
core.sh:1114: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1118: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/beep-boop created (dry run)
service/beep-boop created (server dry run)
E0801 06:22:12.438113   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1122: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/beep-boop created
core.sh:1126: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
core.sh:1130: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
service "beep-boop" deleted
core.sh:1137: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
... skipping 65 lines ...
+++ [0801 06:22:24] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
apps.sh:69: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
daemonset.apps/bind created
apps.sh:73: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1596262943-32311"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
daemonset.apps/bind skipped rollback (current template already matches revision 1)
E0801 06:22:27.433273   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:76: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:77: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind configured
apps.sh:80: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:81: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0801 06:22:30.138308   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:82: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
apps.sh:83: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1596262943-32311"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1596262943-32311"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
E0801 06:22:30.620478   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind will roll back to Pod Template:
  Labels:	service=bind
  Containers:
   kubernetes-pause:
    Image:	k8s.gcr.io/pause:2.0
    Port:	<none>
... skipping 4 lines ...
 (dry run)
daemonset.apps/bind rolled back (server dry run)
apps.sh:87: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
E0801 06:22:33.049039   58383 daemon_controller.go:320] namespace-1596262943-32311/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1596262943-32311", SelfLink:"/apis/apps/v1/namespaces/namespace-1596262943-32311/daemonsets/bind", UID:"0d74f48f-d29d-4c00-9a33-545ac5318a2d", ResourceVersion:"2472", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63731859745, loc:(*time.Location)(0x6a38ca0)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1596262943-32311\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc00131d5c0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00131d600)}, v1.ManagedFieldsEntry{Manager:"kubectl-client-side-apply", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc00131d640), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00131d680)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc00131d6a0), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc00131d6c0)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc00131d6e0), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), 
EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc0021cd3d8), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0006d1490), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc00131d700), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil), SetHostnameAsFQDN:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc00029a4f8)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc0021cd42c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:92: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:93: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
I0801 06:22:35.157961   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:22:35.158033   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:22:35.158048   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
apps.sh:98: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
... skipping 39 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 27 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1596262957-29810
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  5s    replication-controller  Created pod: frontend-kvdw8
  Normal  SuccessfulCreate  5s    replication-controller  Created pod: frontend-sfkr6
  Normal  SuccessfulCreate  5s    replication-controller  Created pod: frontend-dmwwl
core.sh:1224: Successful get rc frontend {{.spec.replicas}}: 3
E0801 06:22:44.594569   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
E0801 06:22:44.640116   58383 replica_set.go:201] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1596262957-29810 /api/v1/namespaces/namespace-1596262957-29810/replicationcontrollers/frontend 26682f5f-3572-4fa0-9d16-479991fd8be6 2519 2 2020-08-01 06:22:39 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kube-controller-manager Update v1 2020-08-01 06:22:39 +0000 UTC FieldsV1 {"f:status":{"f:fullyLabeledReplicas":{},"f:observedGeneration":{},"f:replicas":{}}}} {kubectl-create Update v1 2020-08-01 06:22:39 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{"f:replicas":{},"f:selector":{".":{},"f:app":{},"f:tier":{}},"f:template":{".":{},"f:metadata":{".":{},"f:creationTimestamp":{},"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{".":{},"f:containers":{".":{},"k:{\"name\":\"php-redis\"}":{".":{},"f:env":{".":{},"k:{\"name\":\"GET_HOSTS_FROM\"}":{".":{},"f:name":{},"f:value":{}}},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:ports":{".":{},"k:{\"containerPort\":80,\"protocol\":\"TCP\"}":{".":{},"f:containerPort":{},"f:protocol":{}}},"f:resources":{".":{},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc001fd8718 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] [] <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0801 06:22:44.650202   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: frontend-kvdw8"
core.sh:1228: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1232: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
core.sh:1236: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1240: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller/frontend scaled
I0801 06:22:45.898230   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-fhpct"
core.sh:1244: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1248: Successful get rc frontend {{.spec.replicas}}: 3
... skipping 31 lines ...
(Bdeployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
deployment.apps/nginx-deployment created
I0801 06:22:53.294249   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-76b5cd66f5 to 3"
I0801 06:22:53.303038   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-zdbw6"
I0801 06:22:53.308618   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-sjcxp"
... skipping 23 lines ...
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 33 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1391: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
horizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1395: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
core.sh:1404: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 24 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I0801 06:23:10.079008   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-748ddcb48b to 3"
I0801 06:23:10.096200   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-rvc48"
I0801 06:23:10.101554   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-mnkv9"
I0801 06:23:10.106140   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-zfwm2"
core.sh:1410: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1411: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1412: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0801 06:23:11.354890   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-7bfb7d56b6 to 1"
I0801 06:23:11.372566   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-7bfb7d56b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-7bfb7d56b6-lsl6z"
core.sh:1415: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
E0801 06:23:12.155129   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1416: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I0801 06:23:13.428887   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-resources-748ddcb48b to 2"
I0801 06:23:13.625608   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-resources-748ddcb48b-rvc48"
I0801 06:23:13.643696   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-75dbcccf44 to 1"
core.sh:1421: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
I0801 06:23:13.674982   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-75dbcccf44" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-75dbcccf44-62kkj"
core.sh:1422: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
deployment.apps/nginx-deployment-resources resource requirements updated
I0801 06:23:15.218723   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-resources-748ddcb48b to 1"
I0801 06:23:15.267276   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-resources-748ddcb48b-zfwm2"
I0801 06:23:15.270407   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-975b967f to 1"
I0801 06:23:15.279186   58383 event.go:291] "Event occurred" object="namespace-1596262957-29810/nginx-deployment-resources-975b967f" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-975b967f-f8qff"
E0801 06:23:15.297290   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1425: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1426: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1427: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
apiVersion: apps/v1
kind: Deployment
metadata:
... skipping 375 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1432: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1433: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1434: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 21 lines ...
has:apps/v1
deployment.apps "test-nginx-extensions" deleted
I0801 06:23:20.859157   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/test-nginx-apps" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set test-nginx-apps-69dd6dcd84 to 1"
I0801 06:23:20.885593   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/test-nginx-apps-69dd6dcd84" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-nginx-apps-69dd6dcd84-5wwwx"
deployment.apps/test-nginx-apps created
apps.sh:202: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
E0801 06:23:21.385011   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0801 06:23:21.584733   58383 horizontal.go:354] Horizontal Pod Autoscaler frontend has been deleted in namespace-1596262957-29810
Successful
message:10
has:10
Successful
message:apps/v1
... skipping 14 lines ...
                pod-template-hash=69dd6dcd84
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=69dd6dcd84
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 44 lines ...
I0801 06:23:25.758539   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-with-command" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-with-command-978c69cb6 to 1"
I0801 06:23:25.774678   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-with-command-978c69cb6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-with-command-978c69cb6-w9282"
deployment.apps/nginx-with-command created
apps.sh:226: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
(Bdeployment.apps "nginx-with-command" deleted
apps.sh:232: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
(BE0801 06:23:27.711984   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0801 06:23:28.360632   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/deployment-with-unixuserid created
I0801 06:23:28.435066   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/deployment-with-unixuserid" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set deployment-with-unixuserid-6859c8dfcb to 1"
I0801 06:23:28.443414   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/deployment-with-unixuserid-6859c8dfcb" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: deployment-with-unixuserid-6859c8dfcb-qwxbf"
apps.sh:236: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
(Bdeployment.apps "deployment-with-unixuserid" deleted
apps.sh:243: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 27 lines ...
apps.sh:282: Successful get hpa {{range.items}}{{ if eq .metadata.name \"nginx-deployment\" }}found{{end}}{{end}}:: :
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
apps.sh:285: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "nginx-deployment" deleted
deployment.apps "nginx-deployment" deleted
apps.sh:293: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0801 06:23:42.408727   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx created
I0801 06:23:42.500272   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-54785cbcb8 to 3"
I0801 06:23:42.514193   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-dcjcp"
I0801 06:23:42.520114   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-k24pt"
I0801 06:23:42.521146   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-54785cbcb8-mz9lv"
apps.sh:297: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
... skipping 7 lines ...
apps.sh:304: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
deployment.apps/nginx rolled back (server dry run)
apps.sh:308: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
apps.sh:312: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
I0801 06:23:47.949511   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:23:47.949613   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:23:47.949631   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
apps.sh:315: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
E0801 06:23:48.398191   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:319: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I0801 06:23:52.730094   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-54785cbcb8 to 2"
I0801 06:23:52.929517   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-668db8d6fb to 1"
I0801 06:23:52.929620   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-54785cbcb8-dcjcp"
I0801 06:23:53.236586   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-668db8d6fb" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-668db8d6fb-2x5pm"
Successful
... skipping 150 lines ...
apps.sh:363: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0801 06:23:58.351436   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-6dd48b9849 to 1"
I0801 06:23:58.379416   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-6dd48b9849" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-6dd48b9849-7qrs7"
apps.sh:366: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:367: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
apps.sh:372: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:373: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
apps.sh:376: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:377: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
... skipping 13 lines ...
deployment.apps/nginx-deployment created
I0801 06:24:04.441934   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-b8c4df945 to 3"
I0801 06:24:04.446328   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-b8c4df945" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-b8c4df945-8wjt6"
I0801 06:24:04.453591   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-b8c4df945" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-b8c4df945-r2l8x"
I0801 06:24:04.454386   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-b8c4df945" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-b8c4df945-vxssv"
configmap/test-set-env-config created
E0801 06:24:05.207007   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-set-env-secret created
apps.sh:400: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
apps.sh:402: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
apps.sh:403: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
deployment.apps/nginx-deployment env updated
I0801 06:24:07.540683   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-59b7fccd97 to 1"
... skipping 16 lines ...
I0801 06:24:09.824239   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-5fbc8fbcbf" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-5fbc8fbcbf-ccrcj"
deployment.apps/nginx-deployment env updated
I0801 06:24:10.054279   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-b8c4df945 to 0"
I0801 06:24:10.067282   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-b8c4df945" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-b8c4df945-vxssv"
I0801 06:24:10.080770   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-68d657fb6 to 1"
I0801 06:24:10.090475   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-68d657fb6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-68d657fb6-4v9ws"
E0801 06:24:10.152027   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0801 06:24:10.368923   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-59b7fccd97 to 0"
I0801 06:24:10.377260   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-59b7fccd97" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-59b7fccd97-ph4l5"
I0801 06:24:10.410738   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-7584fc66fd to 1"
I0801 06:24:10.417527   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-7584fc66fd" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-7584fc66fd-h4g7j"
deployment.apps/nginx-deployment env updated
... skipping 3 lines ...
I0801 06:24:10.769492   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-57ddd474c4" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-57ddd474c4-tc2ch"
deployment.apps/nginx-deployment env updated
I0801 06:24:10.937072   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-7f789d7c5f to 0"
I0801 06:24:11.013366   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-5f8c874568 to 1"
I0801 06:24:11.073015   58383 event.go:291] "Event occurred" object="namespace-1596262998-22509/nginx-deployment-7f789d7c5f" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-7f789d7c5f-cb9fh"
deployment.apps "nginx-deployment" deleted
E0801 06:24:11.165951   58383 replica_set.go:532] sync "namespace-1596262998-22509/nginx-deployment-5f8c874568" failed with replicasets.apps "nginx-deployment-5f8c874568" not found
E0801 06:24:11.314831   58383 replica_set.go:532] sync "namespace-1596262998-22509/nginx-deployment-57ddd474c4" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-57ddd474c4": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1596262998-22509/nginx-deployment-57ddd474c4, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 97caf464-90c9-4177-abd7-48734c7e2ebc, UID in object meta: 
E0801 06:24:11.362860   58383 replica_set.go:532] sync "namespace-1596262998-22509/nginx-deployment-7f789d7c5f" failed with replicasets.apps "nginx-deployment-7f789d7c5f" not found
configmap "test-set-env-config" deleted
secret "test-set-env-secret" deleted
+++ exit code: 0
Recording: run_rs_tests
Running command: run_rs_tests

... skipping 44 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1596263051-10211
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 149 lines ...
I0801 06:24:27.267381   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/scale-2" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set scale-2-6865bdcf4d to 3"
I0801 06:24:27.268706   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/scale-1-6865bdcf4d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: scale-1-6865bdcf4d-tcs47"
I0801 06:24:27.281540   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/scale-2-6865bdcf4d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: scale-2-6865bdcf4d-dn427"
I0801 06:24:27.285088   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/scale-3" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set scale-3-6865bdcf4d to 3"
I0801 06:24:27.306296   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/scale-3-6865bdcf4d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: scale-3-6865bdcf4d-h5gf4"
I0801 06:24:27.330782   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/scale-3-6865bdcf4d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: scale-3-6865bdcf4d-8lcpt"
E0801 06:24:27.429148   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:623: Successful get deploy scale-1 {{.spec.replicas}}: 3
apps.sh:624: Successful get deploy scale-2 {{.spec.replicas}}: 3
apps.sh:625: Successful get deploy scale-3 {{.spec.replicas}}: 3
replicaset.apps "frontend" deleted
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
replicaset.apps/frontend created
I0801 06:24:29.136524   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-7lhq8"
I0801 06:24:29.190715   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-qmdkr"
I0801 06:24:29.190751   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-2t7jj"
apps.sh:633: Successful get rs frontend {{.spec.replicas}}: 3
service/frontend exposed
E0801 06:24:29.745210   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:637: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
I0801 06:24:30.360267   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:24:30.360340   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:24:30.360354   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
apps.sh:641: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
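The frontend and frontend-2 services asserted above come from kubectl expose; a hedged sketch (the flag set is illustrative, not the verbatim test script):
# Expose the ReplicaSet as a ClusterIP service on port 80; without
# --name the service inherits the ReplicaSet name "frontend".
kubectl expose rs frontend --port=80
# Expose the same ReplicaSet again under a second service name.
kubectl expose rs frontend --port=80 --name=frontend-2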
... skipping 16 lines ...
apps.sh:661: Successful get rs frontend {{.metadata.generation}}: 5
Successful
message:kubectl-create kube-controller-manager kubectl-set
has:kubectl-set
apps.sh:669: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicaset.apps "frontend" deleted
E0801 06:24:36.829300   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:673: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:677: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0801 06:24:37.696316   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-76p89"
I0801 06:24:37.708918   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-wshqb"
I0801 06:24:37.708970   58383 event.go:291] "Event occurred" object="namespace-1596263051-10211/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-4v8h9"
... skipping 17 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:705: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
Successful
message:kubectl-autoscale
has:kubectl-autoscale
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicaset.apps "frontend" deleted
+++ exit code: 0
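The Error: required flag(s) "max" not set above is kubectl autoscale rejecting a call without --max; a sketch of both forms (target values taken from the apps.sh:705 assertion):
# Succeeds: creates an HPA with min=2, max=3, target 80% CPU.
kubectl autoscale rs frontend --min=2 --max=3 --cpu-percent=80
# Fails: --max is mandatory, producing the error recorded above.
kubectl autoscale rs frontend --min=2 --cpu-percent=80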
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
... skipping 33 lines ...
apps.sh:450: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1596263089-3491"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
apps.sh:453: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:454: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx configured
E0801 06:24:52.634168   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:457: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:458: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:459: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
apps.sh:460: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1596263089-3491"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1596263089-3491"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
... skipping 18 lines ...
apps.sh:466: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
I0801 06:24:55.659259   58383 horizontal.go:354] Horizontal Pod Autoscaler frontend has been deleted in namespace-1596263051-10211
statefulset.apps/nginx rolled back
apps.sh:469: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:470: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:474: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:475: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
apps.sh:478: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:479: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
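A sketch of the rollout commands behind this rollback sequence (the resource name comes from the log; exact flags are an assumption):
# Undo to the previous revision, restoring the one-container template.
kubectl rollout undo statefulset/nginx
# A nonexistent revision is rejected, as asserted above:
# error: unable to find specified revision 1000000 in history
kubectl rollout undo statefulset/nginx --to-revision=1000000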
... skipping 8 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_lists_tests
+++ [0801 06:24:59] Creating namespace namespace-1596263099-14402
namespace/namespace-1596263099-14402 created
Context "test" modified.
+++ [0801 06:24:59] Testing kubectl(v1:lists)
E0801 06:24:59.749526   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/list-service-test created
deployment.apps/list-deployment-test created
I0801 06:25:00.237452   58383 event.go:291] "Event occurred" object="namespace-1596263099-14402/list-deployment-test" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set list-deployment-test-6f68b66b69 to 1"
I0801 06:25:00.247410   58383 event.go:291] "Event occurred" object="namespace-1596263099-14402/list-deployment-test-6f68b66b69" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: list-deployment-test-6f68b66b69-tbp46"
I0801 06:25:00.433642   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:25:00.433711   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
... skipping 41 lines ...
Name:         mock
Namespace:    namespace-1596263100-5481
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
replicationcontroller/mock replaced
I0801 06:25:04.891209   58383 event.go:291] "Event occurred" object="namespace-1596263100-5481/mock" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: mock-j8fdx"
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
E0801 06:25:06.426860   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
... skipping 13 lines ...
generic-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
NAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
service/mock   ClusterIP   10.0.0.55    <none>        99/TCP    1s

NAME                         DESIRED   CURRENT   READY   AGE
replicationcontroller/mock   1         1         0       1s
E0801 06:25:10.809375   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:              mock
Namespace:         namespace-1596263100-5481
Labels:            app=mock
Annotations:       <none>
Selector:          app=mock
Type:              ClusterIP
... skipping 8 lines ...
Name:         mock
Namespace:    namespace-1596263100-5481
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 56 lines ...
Name:         mock
Namespace:    namespace-1596263100-5481
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 13 lines ...
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
E0801 06:25:20.988832   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
service/mock annotated
replicationcontroller/mock annotated
... skipping 16 lines ...
Namespace:    namespace-1596263100-5481
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1596263100-5481
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 81 lines ...
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
generic-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
service "mock" deleted
service "mock2" deleted
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0801 06:25:37.670549   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I0801 06:25:38.260543   58383 event.go:291] "Event occurred" object="namespace-1596263100-5481/mock" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: mock-lns2f"
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
generic-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
service "mock" deleted
... skipping 23 lines ...
(Bpersistentvolume "pv0002" deleted
persistentvolume/pv0003 created
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
(Bpersistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolume/pv0001 created
E0801 06:25:45.947642   58383 pv_protection_controller.go:118] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
E0801 06:25:45.960699   58383 pv_protection_controller.go:118] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
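That warning is kubectl noting that a namespace was supplied while deleting a cluster-scoped object; a sketch (the namespace value is illustrative):
# PersistentVolumes are cluster-scoped, so --namespace is ignored and
# kubectl prints: warning: deleting cluster-scoped resources, not
# scoped to the provided namespace
kubectl delete pv pv0001 --namespace=default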
Successful
... skipping 37 lines ...
Running command: run_storage_class_tests

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [0801 06:25:52] Testing storage class
E0801 06:25:52.354271   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
storageclass.storage.k8s.io/storage-class-name created
storage.sh:108: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storage.sh:109: Successful get sc {{range.items}}{{.metadata.name}}:{{end}}: storage-class-name:
storageclass.storage.k8s.io "storage-class-name" deleted
storage.sh:112: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
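A minimal StorageClass of the kind created and deleted above (the provisioner here is a stand-in assumption; the test's own manifest may differ):
kubectl apply -f - <<EOF
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: storage-class-name
provisioner: kubernetes.io/no-provisioner
EOF
kubectl get sc storage-class-name
kubectl delete storageclass storage-class-name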
... skipping 18 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 38 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 29 lines ...
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Sat, 01 Aug 2020 06:10:47 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Sat, 01 Aug 2020 06:10:47 +0000   Sat, 01 Aug 2020 06:11:51 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 22 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
E0801 06:25:57.778437   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1555: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 patched
core.sh:1558: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
node/127.0.0.1 patched
core.sh:1561: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
Warning: authentication.k8s.io/v1beta1 TokenReview is deprecated in v1.19+, unavailable in v1.22+; use authentication.k8s.io/v1 TokenReview
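The unschedulable flips in core.sh:1555-1561 above are driven by kubectl patch; a sketch (patch payloads assumed, not quoted from the test):
# Set the field, then clear it with a null strategic-merge patch; the
# {{.spec.unschedulable}} template reads back true and then <no value>.
kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":true}}'
kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":null}}'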
... skipping 93 lines ...
  }
}
+++ exit code: 0
Successful
message:yes
has:yes
E0801 06:26:02.770824   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
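A sketch of the kubectl auth can-i forms whose yes/0/error outputs appear above (resource choices illustrative):
# Resource check: prints yes or no.
kubectl auth can-i get pods
# Non-resource URL check.
kubectl auth can-i get /logs/
# Mixing the two is the rejected form above:
# error: --subresource can not be used with NonResourceURL
kubectl auth can-i get /logs/ --subresource=log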
... skipping 56 lines ...
role.rbac.authorization.k8s.io/testing-R reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:840: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:841: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
E0801 06:26:09.077857   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:842: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:843: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
... skipping 24 lines ...
discovery.sh:91: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
I0801 06:26:13.678665   58383 event.go:291] "Event occurred" object="namespace-1596263171-30909/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-8w4xr"
pod "cassandra-sxrcf" deleted
I0801 06:26:13.713345   58383 event.go:291] "Event occurred" object="namespace-1596263171-30909/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-mkgqs"
pod "cassandra-zznnn" deleted
replicationcontroller "cassandra" deleted
E0801 06:26:13.748666   58383 replica_set.go:532] sync "namespace-1596263171-30909/cassandra" failed with replicationcontrollers "cassandra" not found
service "cassandra" deleted
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
... skipping 228 lines ...
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
NAMESPACE                    NAME        READY   STATUS    RESTARTS   AGE
namespace-1596263171-30909   valid-pod   0/1     Pending   0          1s
namespace/all-ns-test-1 created
serviceaccount/test created
namespace/all-ns-test-2 created
E0801 06:26:31.335178   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
serviceaccount/test created
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
all-ns-test-1                default   0         1s
all-ns-test-1                test      0         1s
all-ns-test-2                default   0         0s
... skipping 115 lines ...
namespace-1596263147-23463   default   0         45s
namespace-1596263171-30909   default   0         21s
some-other-random            default   0         24s
has:all-ns-test-2
namespace "all-ns-test-1" deleted
namespace "all-ns-test-2" deleted
E0801 06:26:41.165781   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0801 06:26:42.489862   58383 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
... skipping 209 lines ...
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
E0801 06:26:53.202911   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E0801 06:26:54.065376   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:kubernetes:
has:kubernetes:
E0801 06:26:54.321585   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:foo:
has:foo:
... skipping 488 lines ...
+++ command: run_cluster_management_tests
+++ [0801 06:27:07] Creating namespace namespace-1596263227-8492
namespace/namespace-1596263227-8492 created
Context "test" modified.
+++ [0801 06:27:07] Testing cluster-management commands
node-management.sh:28: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
E0801 06:27:08.349891   58383 reflector.go:127] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod-1 created
pod/test-pod-2 created
node-management.sh:77: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
node/127.0.0.1 tainted
node-management.sh:81: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
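A sketch of the taint add/remove pair this section exercises (key, value, and effect are illustrative, not quoted from the test):
# Add a taint, then remove it with the trailing-dash form; the
# {{range .spec.taints}} template above reads the result back.
kubectl taint nodes 127.0.0.1 dedicated=foo:PreferNoSchedule
kubectl taint nodes 127.0.0.1 dedicated-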
... skipping 42 lines ...
message:node/127.0.0.1 already uncordoned (server dry run)
has:already uncordoned
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
node-management.sh:150: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
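The two usage errors above map to these invocations (the selector value is illustrative):
# Rejected: a node name and --selector are mutually exclusive.
kubectl drain 127.0.0.1 --selector=kubernetes.io/hostname=127.0.0.1
# Accepted: cordon and uncordon a single node by name.
kubectl cordon 127.0.0.1
kubectl uncordon 127.0.0.1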
... skipping 14 lines ...
+++ [0801 06:27:21] Testing kubectl plugins
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"
error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 10 lines ...
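kubectl discovers plugins as kubectl-* executables on PATH; a self-contained sketch of a plugin like the kubectl-foo fixture above (the /tmp path is illustrative):
cat > /tmp/kubectl-foo <<'EOF'
#!/bin/sh
echo "I am plugin foo"
EOF
chmod +x /tmp/kubectl-foo
PATH=/tmp:$PATH kubectl foo          # prints: I am plugin foo
PATH=/tmp:$PATH kubectl plugin list  # discovers /tmp/kubectl-foo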

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0801 06:27:23] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
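The impersonation error above comes from passing --as-group without --as; a sketch (the group name is taken from the csr assertion above):
# Rejected: groups can only be requested while impersonating a user.
kubectl get pods --as-group=system:authenticated
# Accepted: impersonate user1 plus a group, which is what the
# {{.spec.username}} and {{range .spec.groups}} checks observe.
kubectl get pods --as=user1 --as-group=system:authenticated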
... skipping 57 lines ...
I0801 06:27:30.089321   54838 secure_serving.go:241] Stopped listening on 127.0.0.1:6443
I0801 06:27:30.089327   54838 controller.go:123] Shutting down OpenAPI controller
I0801 06:27:30.089552   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.089606   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.089716   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.089720   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0801 06:27:30.089958   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0801 06:27:30.089967   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
... skipping 72 lines ...
I0801 06:27:30.092911   54838 apiservice_controller.go:131] Shutting down APIServiceRegistrationController
I0801 06:27:30.088812   54838 crdregistration_controller.go:142] Shutting down crd-autoregister controller
... skipping 25 lines ...
I0801 06:27:30.093532   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0801 06:27:30.093534   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0801 06:27:30.093534   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093532   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093626   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093648   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093656   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0801 06:27:30.093691   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0801 06:27:30.093713   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093742   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093823   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093839   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093917   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.093941   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0801 06:27:30.094010   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0801 06:27:30.094060   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094097   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094101   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0801 06:27:30.094110   54838 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0801 06:27:30.094125   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094151   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094181   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094220   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094265   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094296   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094336   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094366   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094365   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094401   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094412   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094488   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094532   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094535   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094556   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:30.094576   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
junit report dir: /logs/artifacts
+++ [0801 06:27:30] Clean up complete
+ make test-integration
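
The warning flood above and below is ordinary grpc-go reconnect behavior: the apiserver process (pid 54838) holds many etcd client connections, and once etcd stops listening on 127.0.0.1:2379 each connection's addrConn redials in a loop, logging one "Reconnecting..." line per failed transport attempt until the ClientConn is closed. The klog-style W/I prefixes suggest grpclog output is routed through klog here. A minimal standalone sketch of that loop (a hypothetical illustration, not code from this repo; assumes grpc-go's Dial API), dialing a port with no listener:

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
    )

    func main() {
        // Give the dial a deadline so the program exits once the retry
        // loop has run for a few seconds.
        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        // With no etcd listening on 127.0.0.1:2379, each transport attempt
        // fails with "connection refused" and grpc-go logs a "Reconnecting..."
        // warning, like the clientconn.go:1223 lines in this build log.
        conn, err := grpc.DialContext(ctx, "127.0.0.1:2379",
            grpc.WithInsecure(), grpc.WithBlock())
        if err != nil {
            log.Printf("dial gave up: %v", err) // expected: context deadline exceeded
            return
        }
        defer conn.Close()
    }
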
W0801 06:27:31.095065   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 70 lines ...
W0801 06:27:31.098728   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:32.378146   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 70 lines ...
W0801 06:27:33.016626   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.493482   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.612108   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.692903   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.754382   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.765554   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.773327   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.777460   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.794373   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.885597   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:34.966221   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 90 lines ...
W0801 06:27:39.389681   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0801 06:27:39.391204   54838 client.go:360] parsed scheme: "passthrough"
I0801 06:27:39.391262   54838 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0801 06:27:39.391277   54838 clientconn.go:948] ClientConn switching balancer to "pick_first"
W0801 06:27:39.392513   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 118 lines ...
W0801 06:27:48.272437   54838 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0801 06:27:50.088780   54838 controller.go:193] RemoveEndpoints() timed out
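The burst of warnings above is the apiserver's etcd client retrying a dead endpoint after etcd was torn down. Below is a minimal sketch, assuming the google.golang.org/grpc client library as vendored in this era; it is illustrative only, not code from this build. Dialing a bare address walks the same passthrough-resolver / pick_first-balancer / transport-retry path that emits those "addrConn.createTransport failed ... Reconnecting..." messages (Kubernetes routes grpc's logger into its own logging, which is why they land in this log).

// etcd_dial_sketch.go - hypothetical example, not part of this repository.
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// A plain target uses grpc's default "passthrough" resolver and the
	// default "pick_first" balancer (the INFO lines above). With WithBlock,
	// DialContext keeps retrying the transport against a refused connection
	// until the endpoint accepts or the context deadline expires.
	conn, err := grpc.DialContext(ctx, "127.0.0.1:2379",
		grpc.WithInsecure(), // the test etcd serves plain HTTP on localhost
		grpc.WithBlock(),
	)
	if err != nil {
		log.Fatalf("etcd endpoint never came up: %v", err)
	}
	defer conn.Close()
	log.Println("connected to etcd")
}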
+++ [0801 06:27:58] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [0801 06:27:58] Starting etcd instance
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.OsWfrP5pQc --listen-client-urls http://127.0.0.1:2379 --log-level=debug > "/logs/artifacts/etcd.5ba4656e-d3ba-11ea-a66b-6ed196e94398.root.log.DEBUG.20200801-062758.92742" 2>/dev/null
Waiting for etcd to come up.
... skipping 9 lines ...
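The "Waiting for etcd to come up." step above is a readiness poll against the freshly started instance. A hedged sketch of the idea, in Go rather than the harness's shell (the real scripts do the equivalent with a curl loop; the /health endpoint is standard etcd, but the timeout and interval here are illustrative assumptions):

// wait_for_etcd_sketch.go - hypothetical example of the readiness wait.
package main

import (
	"fmt"
	"net/http"
	"time"
)

// waitForEtcd polls etcd's health endpoint until it answers 200 OK or the
// deadline passes.
func waitForEtcd(url string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		resp, err := http.Get(url)
		if err == nil {
			resp.Body.Close()
			if resp.StatusCode == http.StatusOK {
				return nil // etcd is serving; the tests may proceed
			}
		}
		time.Sleep(250 * time.Millisecond) // brief pause between probes
	}
	return fmt.Errorf("etcd did not come up within %s", timeout)
}

func main() {
	if err := waitForEtcd("http://127.0.0.1:2379/health", 30*time.Second); err != nil {
		panic(err)
	}
	fmt.Println("etcd is up")
}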
{"Time":"2020-08-01T06:33:10.316505875Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"pkg/endpoints/filters/impersonation.go:50 +0x203d\\nnet/http.HandlerFunc.ServeHTTP(0xc01e521e40, 0x7f35a4204a80, 0xc026c1ee30, 0xc0279cce00)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f35a4204a80, 0xc026c1ee30, 0xc0279ccd00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFunc.ServeHTTP(0xc002c1ea00, 0x7f35a4204a80, 0xc026c1ee30, 0xc0279ccd00)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc0279809c0, 0xc01d83b9e0, 0x5174800, 0xc026c1ee30, 0xc0279ccd00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:113 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\"}
{"Time":"2020-08-01T06:33:10.323254433Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:228 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc027abefc0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:503 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0278ad6d0, 0xc027ab2000, 0xc0, 0x4c3f, 0x0, 0x0, 0xc026aa0170)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:202 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc026aa0190, 0x4856480, 0xc027af2000, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112c30, 0x511a760, 0xc027af2000, 0x5109380, 0xc0278ad6d0, 0x0, 0x0)\\n\\t/home/prow/go/src/"}
{"Time":"2020-08-01T06:33:10.323283909Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112c30, 0x511a760, 0xc027af2000, 0x5109380, 0xc0278ad6d0, 0x3a5bdf4, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc027af20a0, 0x511a760, 0xc027af2000, 0x5109380, 0xc0278ad6d0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc027af20a0, 0x511a760, 0xc027af2000, 0x5109380, 0xc0278ad6d0, 0xc000112c30, 0x5122de0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_outpu"}
{"Time":"2020-08-01T06:33:10.323336437Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"t/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4918617, 0x10, 0x7f35a4422f40, 0xc027af20a0, 0x516dd00, 0xc027a8a308, 0xc027aea700, 0x1f4, 0x511a760, 0xc027af2000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5171280, 0xc01e7d74a0, 0x51715c0, 0x732b668, 0x48fba30, 0x4, 0x48fa2e6, 0x2, 0x516dd00, 0xc027a8a308, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:251 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x510d6c0, 0xc027ad6b40, 0x5171280, 0xc01e7d74a0, 0x4"}
{"Time":"2020-08-01T06:33:10.324104113Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"/metrics.InstrumentRouteFunc.func1(0xc027abef30, 0xc027ab0b60)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:384 +0x282\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0190395f0, 0x7f35a4204a80, 0xc027a8a2f8, 0xc027aea700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4911e1e, 0xe, 0xc0190395f0, 0xc00049ce70, 0x7f35a4204a80, 0xc027a8a2f8, 0xc027aea700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x539\\nk8s.io/kubernetes/vendor"}
{"Time":"2020-08-01T06:33:10.324198674Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"ernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x203d\\nnet/http.HandlerFunc.ServeHTTP(0xc01e521e40, 0x7f35a4204a80, 0xc027a8a2f8, 0xc027aea700)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f35a4204a80, 0xc027a8a2f8, 0xc027aea600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFunc.ServeHTTP(0xc002c1ea00, 0x7f35a4204a80, 0xc027a8a2f8, 0xc027aea600)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc02792bec0, 0xc01d83b9e0, 0x5174800, 0xc027a8a2f8, 0xc027aea600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:113 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filt"}
{"Time":"2020-08-01T06:33:11.365572493Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"meout.go:228 +0xb2\\nnet/http.Error(0x7f30a273f8e8, 0xc008a9b558, 0xc008f59bc0, 0x60, 0x1f4)\\n\\t/usr/local/go/src/net/http/server.go:2054 +0x1f6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.InternalError(0x7f30a273f8e8, 0xc008a9b558, 0xc009193200, 0x5183420, 0xc0091b0340)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/errors.go:75 +0x11e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7f30a273f8e8, 0xc008a9b558, 0xc009193200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:69 +0x497\\nnet/http.HandlerFunc.ServeHTTP(0xc00072b0c0, 0x7f30a273f8e8, 0xc008a9b558, 0xc009193200)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithMaxInFlightLimit.func2(0x7f30a273f8e8, 0xc008a9b558, 0xc0091932"}
{"Time":"2020-08-01T06:33:11.365588386Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:175 +0x4cf\\nnet/http.HandlerFunc.ServeHTTP(0xc00477efc0, 0x7f30a273f8e8, 0xc008a9b558, 0xc009193200)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f30a273f8e8, 0xc008a9b558, 0xc009193200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x203d\\nnet/http.HandlerFunc.ServeHTTP(0xc00072b140, 0x7f30a273f8e8, 0xc008a9b558, 0xc009193200)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f30a273f8e8, 0xc008a9b558, 0xc009193100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFun"}
{"Time":"2020-08-01T06:33:19.861120999Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"netes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc010fd6780, 0x1f7)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:503 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc010fb3ae0, 0xc004940000, 0xa3, 0x9ee, 0x0, 0x0, 0xc0087bc178)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:202 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc0087bc198, 0x48bdb60, 0xc010fcee60, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc00014ea00, 0x5190840, 0xc010fcee60, 0x517f460, 0xc010fb3ae0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/ru"}
{"Time":"2020-08-01T06:33:19.861215173Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"ntime/serializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc00014ea00, 0x5190840, 0xc010fcee60, 0x517f460, 0xc010fb3ae0, 0x3aab064, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc010fcef00, 0x5190840, 0xc010fcee60, 0x517f460, 0xc010fb3ae0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc010fcef00, 0x5190840, 0xc010fcee60, 0x517f460, 0xc010fb3ae0, 0xc00014ea00, 0x5199580)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/version"}
{"Time":"2020-08-01T06:33:19.861435321Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"ing/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4981202, 0x10, 0x7f30c00fd980, 0xc010fcef00, 0x51e4f20, 0xc01098efd8, 0xc010fd8600, 0x1f7, 0x5190840, 0xc010fcee60)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x51e8520, 0xc008b86ea0, 0x51e8860, 0x7474268, 0x0, 0x0, 0x4961f56, 0x2, 0x51e4f20, 0xc01098efd8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:251 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x517ecc0, 0xc010fcedc0, 0x51e8520, 0xc008b86ea0, 0x0, 0x0, 0x4961f56, 0x2, 0x51e4f20, 0xc01098efd8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_out"}
{"Time":"2020-08-01T06:33:19.861598049Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"put/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:270 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:89\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1.1()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:188 +0x259\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.RecordLongRunning(0xc010fd8600, 0xc010faf290, 0x496c673, 0x9, 0xc010fc2f90)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:340 +0x289\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1(0x51e4f20, 0xc01098efd8, 0xc010fd8600)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2020-08-01T06:33:19.861656788Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:185 +0x472\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulConnectResource.func1(0xc010fd66f0, 0xc00cc6c620)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1211 +0x99\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc010fd66f0, 0xc00cc6c620)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:384 +0x282\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc008ffaea0, 0x7f30a273f8e8, 0xc01098efc8, 0xc010fd8600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2020-08-01T06:33:19.861866888Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:175 +0x4cf\\nnet/http.HandlerFunc.ServeHTTP(0xc008c921b0, 0x7f30a273f8e8, 0xc01098efc8, 0xc010fd8600)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f30a273f8e8, 0xc01098efc8, 0xc010fd8600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x203d\\nnet/http.HandlerFunc.ServeHTTP(0xc008344e40, 0x7f30a273f8e8, 0xc01098efc8, 0xc010fd8600)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f30a273f8e8, 0xc01098efc8, 0xc010fd8500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFunc.ServeHTTP(0xc00559d7c0, 0x7f30a273f8e8, 0xc01098efc8, 0xc010fd8500"}
{"Time":"2020-08-01T06:33:19.861956065Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":")\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc010f99b60, 0xc001fc85a0, 0x51ebb20, 0xc01098efc8, 0xc010fd8500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:113 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:99 +0x1cc\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"no endpoints available for service \\\\\\\\\\\\\\\"a\\\\\\\\\\\\\\\"\\\\\\\",\\\\\\\"reason\\\\\\\":\\\\\\\"ServiceUnavailable\\\\\\\",\\\\\\\"code\\\\\\\":503}\\\\n\\\"\\n\"\n"}
{"Time":"2020-08-01T06:33:28.91492124Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"o/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc018639080, 0x1f7)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:503 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc018632dc0, 0xc008288400, 0xa3, 0xb1d, 0x0, 0x0, 0xc018468178)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:202 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc018468198, 0x48bdb60, 0xc0186401e0, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc00014ea00, 0x5190840, 0xc0186401e0, 0x517f460, 0xc018632dc0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery"}
{"Time":"2020-08-01T06:33:28.914930567Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"/pkg/runtime/serializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc00014ea00, 0x5190840, 0xc0186401e0, 0x517f460, 0xc018632dc0, 0x3aab064, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc018640280, 0x5190840, 0xc0186401e0, 0x517f460, 0xc018632dc0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc018640280, 0x5190840, 0xc0186401e0, 0x517f460, 0xc018632dc0, 0xc00014ea00, 0x5199580)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/"}
{"Time":"2020-08-01T06:33:28.914939166Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4981202, 0x10, 0x7f30c00fd980, 0xc018640280, 0x51e4f20, 0xc017ebb338, 0xc01863e200, 0x1f7, 0x5190840, 0xc0186401e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x51e8520, 0xc010056960, 0x51e8860, 0x7474268, 0x0, 0x0, 0x4961f56, 0x2, 0x51e4f20, 0xc017ebb338, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:251 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x517ecc0, 0xc018640140, 0x51e8520, 0xc010056960, 0x0, 0x0, 0x4961f56, 0x2, 0x51e4f20, 0xc017ebb338, ...)\\n\\t/home/prow/go/src/k8s.io/kubernet"}
{"Time":"2020-08-01T06:33:28.914947988Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"es/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:270 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:89\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1.1()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:188 +0x259\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.RecordLongRunning(0xc01863e200, 0xc0186078c0, 0x496c673, 0x9, 0xc00b570f90)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:340 +0x289\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.ConnectResource.func1(0x51e4f20, 0xc017ebb338, 0xc01863e200)\\n\\t/home/prow/go/src"}
{"Time":"2020-08-01T06:33:28.914956885Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:185 +0x472\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulConnectResource.func1(0xc018638ff0, 0xc0186343f0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1211 +0x99\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc018638ff0, 0xc0186343f0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:384 +0x282\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0106ce2d0, 0x7f30a273f8e8, 0xc017ebb328, 0xc01863e200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src"}
{"Time":"2020-08-01T06:33:28.914978311Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"rc/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:175 +0x4cf\\nnet/http.HandlerFunc.ServeHTTP(0xc0100520f0, 0x7f30a273f8e8, 0xc017ebb328, 0xc01863e200)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f30a273f8e8, 0xc017ebb328, 0xc01863e200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x203d\\nnet/http.HandlerFunc.ServeHTTP(0xc00cef8a40, 0x7f30a273f8e8, 0xc017ebb328, 0xc01863e200)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthentication.func1(0x7f30a273f8e8, 0xc017ebb328, 0xc01863e100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:70 +0x672\\nnet/http.HandlerFunc.ServeHTTP(0xc00b8fdb30, 0x7f30a273f8e8, 0xc017ebb328, 0xc01"}
{"Time":"2020-08-01T06:33:28.914986926Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAliceNotForbiddenOrUnauthorized","Output":"863e100)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc018619800, 0xc00922f0c0, 0x51ebb20, 0xc017ebb328, 0xc01863e100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:113 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:99 +0x1cc\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"no endpoints available for service \\\\\\\\\\\\\\\"a\\\\\\\\\\\\\\\"\\\\\\\",\\\\\\\"reason\\\\\\\":\\\\\\\"ServiceUnavailable\\\\\\\",\\\\\\\"code\\\\\\\":503}\\\\n\\\"\\n\"\n"}
{"Time":"2020-08-01T06:33:42.985390276Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc01bb75d70, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:503 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0189886e0, 0xc004d44000, 0xbb, 0xa36, 0x0, 0x0, 0xc01c9dbef8)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:202 +0x1f7\\nencoding/json.(*Encoder).Encode(0xc01c9dbf18, 0x48bdb60, 0xc01b345720, 0x0, 0x41147b)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1cb\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc00014ea00, 0x5190840, 0xc01b345720, 0x517f460, 0xc0189886e0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apima"}
{"Time":"2020-08-01T06:33:42.985398115Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"chinery/pkg/runtime/serializer/json/json.go:326 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc00014ea00, 0x5190840, 0xc01b345720, 0x517f460, 0xc0189886e0, 0x3aab064, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:300 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc01b3457c0, 0x5190840, 0xc01b345720, 0x517f460, 0xc0189886e0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x396\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc01b3457c0, 0x5190840, 0xc01b345720, 0x517f460, 0xc0189886e0, 0xc00014ea00, 0x5199580)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/seri"}
{"Time":"2020-08-01T06:33:42.985405694Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"alizer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4981202, 0x10, 0x7f30c00fd980, 0xc01b3457c0, 0x51e4f20, 0xc004c46920, 0xc014eca700, 0x1f4, 0x5190840, 0xc01b345720)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:96 +0x12c\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x51e8520, 0xc00c51bc20, 0x51e8860, 0x7474268, 0x0, 0x0, 0x4961f56, 0x2, 0x51e4f20, 0xc004c46920, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:251 +0x572\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5178da0, 0xc0184678a0, 0x51e8520, 0xc00c51bc20, 0x0, 0x0, 0x4961f56, 0x2, 0x51e4f20, 0xc004c46920, ...)\\n\\t/home/prow/go/src/k8s.io/k"}
{"Time":"2020-08-01T06:33:42.985416541Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"ubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:270 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:89\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.DeleteResource.func1(0x51e4f20, 0xc004c46920, 0xc014eca700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/delete.go:95 +0x1a25\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulDeleteResource.func1(0xc01bb75ce0, 0xc0149ff490)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1175 +0x83\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc01bb75ce0, 0xc0149ff490)\\n\\t/home/prow/go/sr"}
{"Time":"2020-08-01T06:33:42.985423486Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"c/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:384 +0x282\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc018349710, 0x7f30a273f8e8, 0xc004c46910, 0xc014eca700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa84\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x497a869, 0xe, 0xc018349710, 0xc004321a40, 0x7f30a273f8e8, 0xc004c46910, 0xc014eca700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x539\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7f30a273f8e8, 0x"}
{"Time":"2020-08-01T06:33:42.985431135Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"c004c46910, 0xc014eca700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x563\\nnet/http.HandlerFunc.ServeHTTP(0xc0086217c0, 0x7f30a273f8e8, 0xc004c46910, 0xc014eca700)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithMaxInFlightLimit.func2(0x7f30a273f8e8, 0xc004c46910, 0xc014eca700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/maxinflight.go:175 +0x4cf\\nnet/http.HandlerFunc.ServeHTTP(0xc00be3d290, 0x7f30a273f8e8, 0xc004c46910, 0xc014eca700)\\n\\t/usr/local/go/src/net/http/server.go:2042 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7f30a273f8e8, 0xc004c46910, 0xc014eca600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:148 +0xc70\\"}
... skipping 452 lines ...
    limit_test.go:218: skipping expensive test
    --- SKIP: TestLimits/MergePatchType_nested_patch_under_limit (0.00s)

=== SKIP: vendor/k8s.io/apiextensions-apiserver/test/integration TestLimits/ApplyPatchType_nested_patch_under_limit (0.00s)
    limit_test.go:229: skipping expensive test
I0801 06:44:11.045933  121631 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:44:11.045976  121631 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
W0801 06:44:11.046073  121631 cacher.go:148] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
I0801 06:44:11.048396  121631 ???:1] sending watch cancel request for closed watcher{watch-id 11 0  <nil>}
W0801 06:44:11.052212  121631 ???:1] failed to send watch cancel request{watch-id 11 0  <nil>} {error 25 0  EOF}
    --- SKIP: TestLimits/ApplyPatchType_nested_patch_under_limit (0.00s)


=== Failed
=== FAIL: test/integration/master  (0.00s)
I0801 06:37:39.197202  117526 etcd.go:81] etcd already running at http://127.0.0.1:2379
FAIL	k8s.io/kubernetes/test/integration/master	611.101s
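The lone failure is TestMain in test/integration/master: the framework found an etcd already listening on 127.0.0.1:2379 (etcd.go:81) and reused it, and the package then failed after 611.101s, consistent with running past the default 10-minute go test deadline. A minimal Go sketch of the kind of already-running probe that log line implies; the function is hypothetical, not the real framework helper:

// etcdAlreadyRunning treats a successful TCP dial of the client port as an
// existing etcd instance. Sketch only; the real check may probe differently.
package main

import (
	"fmt"
	"net"
	"time"
)

func etcdAlreadyRunning(addr string) bool {
	conn, err := net.DialTimeout("tcp", addr, time.Second)
	if err != nil {
		return false // nothing listening; a fresh etcd would be started
	}
	conn.Close()
	return true
}

func main() {
	addr := "127.0.0.1:2379"
	if etcdAlreadyRunning(addr) {
		fmt.Println("etcd already running at http://" + addr)
	} else {
		fmt.Println("no etcd detected; would start a fresh instance")
	}
}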


DONE 2839 tests, 24 skipped, 1 failure in 12.195s
+++ [0801 06:48:06] Saved JUnit XML test report to /logs/artifacts/junit_20200801-062817.xml
make[1]: *** [Makefile:185: test] Error 1
!!! [0801 06:48:06] Call tree:
!!! [0801 06:48:06]  1: hack/make-rules/test-integration.sh:97 runTests(...)
+++ [0801 06:48:06] Cleaning up etcd
+++ [0801 06:48:07] Integration test cleanup complete
make: *** [Makefile:204: test-integration] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
Cleaning up after docker
Stopping Docker: docker
Program process in pidfile '/var/run/docker-ssd.pid', 1 process(es), refused to die.
... skipping 3 lines ...