PR: leakingtapan: Switch to use new test framework
Result: FAILURE
Tests: 0 failed / 0 succeeded
Started: 2019-09-07 01:47
Elapsed: 22m3s
Revision: 89dc52e578834dbebb37d0951b304f64165521d5
Refs: 1009

No Test Failures!


Error lines from build-log.txt

... skipping 1130 lines ...

Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: dial tcp: lookup api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: dial tcp: lookup api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: dial tcp: lookup api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local


unexpected error during validation: error listing nodes: Get https://api-test-cluster-2962-k8s-ba7tol-1344859825.us-west-2.elb.amazonaws.com/api/v1/nodes: EOF
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local

INSTANCE GROUPS
NAME			ROLE	MACHINETYPE	MIN	MAX	SUBNETS
... skipping 7 lines ...
KIND	NAME			MESSAGE
Machine	i-0069200eae2db045e	machine "i-0069200eae2db045e" has not yet joined cluster
Machine	i-0071887dc7b5ec366	machine "i-0071887dc7b5ec366" has not yet joined cluster
Machine	i-0091de214f5fa839d	machine "i-0091de214f5fa839d" has not yet joined cluster
Machine	i-09694d5f00bfc9c99	machine "i-09694d5f00bfc9c99" has not yet joined cluster

Validation Failed
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local

INSTANCE GROUPS
NAME			ROLE	MACHINETYPE	MIN	MAX	SUBNETS
... skipping 9 lines ...
Machine	i-0069200eae2db045e					machine "i-0069200eae2db045e" has not yet joined cluster
Machine	i-0091de214f5fa839d					machine "i-0091de214f5fa839d" has not yet joined cluster
Machine	i-09694d5f00bfc9c99					machine "i-09694d5f00bfc9c99" has not yet joined cluster
Pod	kube-system/kube-dns-685fbb458-cj6fh			kube-system pod "kube-dns-685fbb458-cj6fh" is pending
Pod	kube-system/kube-dns-autoscaler-74887878cc-fvslk	kube-system pod "kube-dns-autoscaler-74887878cc-fvslk" is pending

Validation Failed
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local

INSTANCE GROUPS
NAME			ROLE	MACHINETYPE	MIN	MAX	SUBNETS
... skipping 9 lines ...

VALIDATION ERRORS
KIND	NAME										MESSAGE
Pod	kube-system/etcd-manager-events-ip-172-20-54-24.us-west-2.compute.internal	kube-system pod "etcd-manager-events-ip-172-20-54-24.us-west-2.compute.internal" is pending
Pod	kube-system/kube-dns-685fbb458-cj6fh						kube-system pod "kube-dns-685fbb458-cj6fh" is not ready (kubedns)

Validation Failed
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local

INSTANCE GROUPS
NAME			ROLE	MACHINETYPE	MIN	MAX	SUBNETS
... skipping 8 lines ...
ip-172-20-70-59.us-west-2.compute.internal	node	True

VALIDATION ERRORS
KIND	NAME									MESSAGE
Pod	kube-system/kube-proxy-ip-172-20-57-34.us-west-2.compute.internal	kube-system pod "kube-proxy-ip-172-20-57-34.us-west-2.compute.internal" is pending

Validation Failed
Using cluster from kubectl context: test-cluster-2962.k8s.local

Validating cluster test-cluster-2962.k8s.local

INSTANCE GROUPS
NAME			ROLE	MACHINETYPE	MIN	MAX	SUBNETS
... skipping 285 lines ...
Ingress with multi-path echo backend
/home/prow/go/src/github.com/kubernetes-sigs/aws-alb-ingress-controller/test/e2e/ingress/multi_path_echo.go:190
  [mod-ip] should work [It]
  /home/prow/go/src/github.com/kubernetes-sigs/aws-alb-ingress-controller/test/e2e/ingress/multi_path_echo.go:212

  all targets in arn:aws:elasticloadbalancing:us-west-2:607362164682:targetgroup/c0ad6987-2c7b99edb154c164307/0ae32d35a2bf7be6 should be healthy
  Expected error:
      <*awserr.baseError | 0xc0008d9a40>: {
          code: "RequestCanceled",
          message: "request context canceled",
          errs: [{}],
      }
      RequestCanceled: request context canceled
... skipping 4 lines ...
------------------------------
Sep  7 02:07:58.774: INFO: Running AfterSuite actions on all nodes


Summarizing 1 Failure:

[Fail] Ingress with multi-path echo backend [It] [mod-ip] should work 
/home/prow/go/src/github.com/kubernetes-sigs/aws-alb-ingress-controller/test/e2e/ingress/shared/targets.go:36

Ran 2 of 2 Specs in 441.405 seconds
FAIL! -- 1 Passed | 1 Failed | 0 Pending | 0 Skipped
--- FAIL: TestE2E (441.41s)
FAIL

Ginkgo ran 1 suite in 8m9.820676726s
Test Suite Failed
2019/09/07 02:07:58 Deleting cluster test-cluster-2962.k8s.local
TYPE			NAME											ID
autoscaling-config	master-us-west-2a.masters.test-cluster-2962.k8s.local-20190907015127			master-us-west-2a.masters.test-cluster-2962.k8s.local-20190907015127
autoscaling-config	nodes.test-cluster-2962.k8s.local-20190907015127					nodes.test-cluster-2962.k8s.local-20190907015127
autoscaling-group	master-us-west-2a.masters.test-cluster-2962.k8s.local					master-us-west-2a.masters.test-cluster-2962.k8s.local
autoscaling-group	nodes.test-cluster-2962.k8s.local							nodes.test-cluster-2962.k8s.local
... skipping 192 lines ...
dhcp-options:dopt-0ed3a254d38dbc016	ok
Deleted kubectl config for test-cluster-2962.k8s.local

Deleted cluster: "test-cluster-2962.k8s.local"
2019/09/07 02:09:49 exit status 1
exit status 1
Makefile:58: recipe for target 'e2e-test' failed
make: *** [e2e-test] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
[Barnacle] 2019/09/07 02:09:49 Cleaning up Docker data root...
[Barnacle] 2019/09/07 02:09:49 Removing all containers.
... skipping 25 lines ...