Result: FAILURE
Tests: 57 failed / 720 succeeded
Started: 2022-07-11 16:29
Elapsed: 53m15s
Revision: master

Test Failures


Kubernetes e2e suite [sig-apps] StatefulSet Basic StatefulSet functionality [StatefulSetBasic] should perform rolling updates and roll backs of template modifications with PVCs 10m11s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-apps\]\sStatefulSet\sBasic\sStatefulSet\sfunctionality\s\[StatefulSetBasic\]\sshould\sperform\srolling\supdates\sand\sroll\sbacks\sof\stemplate\smodifications\swith\sPVCs$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/apps/statefulset.go:288
Jul 11 17:03:53.726: Failed waiting for pods to enter running: timed out waiting for the condition
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/statefulset/wait.go:80
				
stdout/stderr: junit_18.xml



Kubernetes e2e suite [sig-auth] ServiceAccounts ServiceAccountIssuerDiscovery should support OIDC discovery of service account issuer [Conformance] 1m7s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-auth\]\sServiceAccounts\sServiceAccountIssuerDiscovery\sshould\ssupport\sOIDC\sdiscovery\sof\sservice\saccount\sissuer\s\[Conformance\]$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:630
Jul 11 17:02:02.415: Unexpected error:
    <*errors.errorString | 0xc0022127a0> (escaped struct form omitted; message follows):
    pod "oidc-discovery-validator" failed with status: {Phase:Failed Conditions:[{Type:Initialized Status:True LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2022-07-11 17:01:02 +0000 UTC Reason: Message:} {Type:Ready Status:False LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2022-07-11 17:01:29 +0000 UTC Reason:PodFailed Message:} {Type:ContainersReady Status:False LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2022-07-11 17:01:29 +0000 UTC Reason:PodFailed Message:} {Type:PodScheduled Status:True LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2022-07-11 17:01:02 +0000 UTC Reason: Message:}] Message: Reason: NominatedNodeName: HostIP:172.20.57.144 PodIP:100.108.209.53 PodIPs:[{IP:100.108.209.53}] StartTime:2022-07-11 17:01:02 +0000 UTC InitContainerStatuses:[] ContainerStatuses:[{Name:oidc-discovery-validator State:{Waiting:nil Running:nil Terminated:&ContainerStateTerminated{ExitCode:1,Signal:0,Reason:Error,Message:,StartedAt:2022-07-11 17:01:02 +0000 UTC,FinishedAt:2022-07-11 17:01:27 +0000 UTC,ContainerID:containerd://7e0d3945683318b94af20a3199c84534fa57e09891b503a1bb353423856a8427,}} LastTerminationState:{Waiting:nil Running:nil Terminated:nil} Ready:false RestartCount:0 Image:k8s.gcr.io/e2e-test-images/agnhost:2.39 ImageID:k8s.gcr.io/e2e-test-images/agnhost@sha256:7e8bdd271312fd25fc5ff5a8f04727be84044eb3d7d8d03611972a6752e2e11e ContainerID:containerd://7e0d3945683318b94af20a3199c84534fa57e09891b503a1bb353423856a8427 Started:0xc000923f95}] QOSClass:BestEffort EphemeralContainerStatuses:[]}
occurred
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/auth/service_accounts.go:789
				
stdout/stderr: junit_19.xml



Kubernetes e2e suite [sig-cli] Kubectl client Guestbook application should create and stop a working application [Conformance] 10m32s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-cli\]\sKubectl\sclient\sGuestbook\sapplication\sshould\screate\sand\sstop\sa\sworking\sapplication\s\s\[Conformance\]$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:630
Jul 11 17:07:26.024: Frontend service did not start serving content in 600 seconds.
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubectl/kubectl.go:375
				
stdout/stderr: junit_01.xml



Kubernetes e2e suite [sig-cli] Kubectl client Simple pod should handle in-cluster config 3m17s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-cli\]\sKubectl\sclient\sSimple\spod\sshould\shandle\sin\-cluster\sconfig$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubectl/kubectl.go:646
Jul 11 16:58:17.002: Unexpected error:
    <exec.CodeExitError> (Code: 255; escaped duplicate of the expanded output below omitted):
    error running /home/prow/go/src/k8s.io/kops/_rundir/9001703b-0136-11ed-9fb7-3e7cb3b25c5e/kubectl --server=https://api.e2e-e2e-kops-grid-calico-u2004-k22-containerd.test-cncf-aws.k8s.io --kubeconfig=/root/.kube/config --namespace=kubectl-4720 exec httpd -- /bin/sh -x -c /tmp/kubectl get pods --kubeconfig=/tmp/icc-override.kubeconfig --v=6 2>&1:
    Command stdout:
    I0711 16:57:06.886443     241 loader.go:372] Config loaded from file:  /tmp/icc-override.kubeconfig
    I0711 16:57:16.897600     241 round_trippers.go:454] GET https://kubernetes.default.svc:443/api?timeout=32s  in 10009 milliseconds
    I0711 16:57:16.897659     241 cached_discovery.go:121] skipped caching discovery info due to Get "https://kubernetes.default.svc:443/api?timeout=32s": dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: read udp 100.127.59.127:49382->100.64.0.10:53: read: connection refused
    I0711 16:57:31.901640     241 round_trippers.go:454] GET https://kubernetes.default.svc:443/api?timeout=32s  in 15003 milliseconds
    I0711 16:57:31.902034     241 cached_discovery.go:121] skipped caching discovery info due to Get "https://kubernetes.default.svc:443/api?timeout=32s": dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: read udp 100.127.59.127:41250->100.64.0.10:53: read: connection refused
    I0711 16:57:31.902222     241 shortcut.go:89] Error loading discovery information: Get "https://kubernetes.default.svc:443/api?timeout=32s": dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: read udp 100.127.59.127:41250->100.64.0.10:53: read: connection refused
    I0711 16:57:46.910019     241 round_trippers.go:454] GET https://kubernetes.default.svc:443/api?timeout=32s  in 15007 milliseconds
    I0711 16:57:46.910092     241 cached_discovery.go:121] skipped caching discovery info due to Get "https://kubernetes.default.svc:443/api?timeout=32s": dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: read udp 100.127.59.127:46195->100.64.0.10:53: read: connection refused
    I0711 16:58:01.914760     241 round_trippers.go:454] GET https://kubernetes.default.svc:443/api?timeout=32s  in 15004 milliseconds
    I0711 16:58:01.917772     241 cached_discovery.go:121] skipped caching discovery info due to Get "https://kubernetes.default.svc:443/api?timeout=32s": dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: read udp 100.127.59.127:37053->100.64.0.10:53: read: connection refused
    I0711 16:58:16.922790     241 round_trippers.go:454] GET https://kubernetes.default.svc:443/api?timeout=32s  in 15004 milliseconds
    I0711 16:58:16.922841     241 cached_discovery.go:121] skipped caching discovery info due to Get "https://kubernetes.default.svc:443/api?timeout=32s": dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: read udp 100.127.59.127:59304->100.64.0.10:53: read: connection refused
    I0711 16:58:16.922888     241 helpers.go:235] Connection error: Get https://kubernetes.default.svc:443/api?timeout=32s: dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: read udp 100.127.59.127:59304->100.64.0.10:53: read: connection refused
    F0711 16:58:16.922906     241 helpers.go:116] The connection to the server kubernetes.default.svc:443 was refused - did you specify the right host or port?
    goroutine 1 [running]:
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.stacks(0xc00000e001, 0xc000550000, 0x9c, 0x14c)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1026 +0xb9
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).output(0x30d3380, 0xc000000003, 0x0, 0x0, 0xc0000d2310, 0x2, 0x27f46b8, 0xa, 0x74, 0x40e300)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:975 +0x1e5
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).printDepth(0x30d3380, 0xc000000003, 0x0, 0x0, 0x0, 0x0, 0x2, 0xc00038bc70, 0x1, 0x1)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:735 +0x185
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.FatalDepth(...)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1500
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.fatal(0xc000148000, 0x6d, 0x1)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:94 +0x288
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.checkErr(0x226a860, 0xc0003258f0, 0x20ec0f0)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:189 +0x935
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.CheckErr(...)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:116
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/get.NewCmdGet.func2(0xc000092a00, 0xc000594900, 0x1, 0x3)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/get/get.go:180 +0x159
    k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).execute(0xc000092a00, 0xc0005948d0, 0x3, 0x3, 0xc000092a00, 0xc0005948d0)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:856 +0x2c2
    k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc0001ad680, 0xc00007e180, 0xc00003a050, 0x5)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:960 +0x375
    k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).Execute(...)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:897
    main.main()
    	_output/dockerized/go/src/k8s.io/kubernetes/cmd/kubectl/kubectl.go:49 +0x1f7
    
    goroutine 6 [chan receive]:
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).flushDaemon(0x30d3380)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1169 +0x8b
    created by k8s.io/kubernetes/vendor/k8s.io/klog/v2.init.0
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:420 +0xdf
    
    goroutine 9 [select]:
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x20ebff8, 0x2268d20, 0xc0006550b0, 0x1, 0xc000048b40)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x118
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x20ebff8, 0x12a05f200, 0x0, 0x1, 0xc000048b40)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0x98
    k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(0x20ebff8, 0x12a05f200, 0xc000048b40)
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90 +0x4d
    created by k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/util/logs.InitLogs
    	/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/util/logs/logs.go:51 +0x96
    
    goroutine 75 [runnable]:
    net/http.setRequestCancel.func4(0x0, 0xc000153800, 0xc00003b5e0, 0xc000045f68, 0xc000048d80)
    	/usr/local/go/src/net/http/client.go:397 +0x96
    created by net/http.setRequestCancel
    	/usr/local/go/src/net/http/client.go:396 +0x337
    
    stderr:
    + /tmp/kubectl get pods '--kubeconfig=/tmp/icc-override.kubeconfig' '--v=6'
    command terminated with exit code 255
    
    error:
    exit status 255
occurred
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:1102
				
stdout/stderr: junit_13.xml

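Every request in the in-cluster-config failure above dies the same way: resolving kubernetes.default.svc through the cluster DNS service at 100.64.0.10:53 is refused; only the ephemeral client port changes between attempts. A minimal, illustrative sketch (the helper and sample lines are condensed from the output above; none of this is part of any Kubernetes tooling) of extracting that shared signature:

```python
import re
from collections import Counter

# Sample lines condensed from the kubectl --v=6 output above; only the
# ephemeral client port differs between attempts.
LOG_LINES = [
    "dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: "
    "read udp 100.127.59.127:49382->100.64.0.10:53: read: connection refused",
    "dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: "
    "read udp 100.127.59.127:41250->100.64.0.10:53: read: connection refused",
    "dial tcp: lookup kubernetes.default.svc on 100.64.0.10:53: "
    "read udp 100.127.59.127:59304->100.64.0.10:53: read: connection refused",
]

# Capture the name being resolved and the DNS server; drop the variable port.
SIG = re.compile(r"lookup (\S+) on ([\d.]+:\d+)")

def signature(line: str):
    """Return (dns-name, dns-server) for a resolver error line, else None."""
    m = SIG.search(line)
    return m.groups() if m else None

counts = Counter(signature(line) for line in LOG_LINES)
print(counts)
```

All three attempts collapse to a single (name, server) signature, which is the point: the test did not flake three different ways, it hit the same unreachable resolver each time.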


Kubernetes e2e suite [sig-network] Conntrack should drop INVALID conntrack entries 1m16s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-network\]\sConntrack\sshould\sdrop\sINVALID\sconntrack\sentries$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/network/conntrack.go:361
Jul 11 16:59:40.987: Boom server pod did not sent any bad packet to the client
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:113
				
stdout/stderr: junit_07.xml



Kubernetes e2e suite [sig-network] DNS should provide /etc/hosts entries for the cluster [LinuxOnly] [Conformance] 10m24s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-network\]\sDNS\sshould\sprovide\s\/etc\/hosts\sentries\sfor\sthe\scluster\s\[LinuxOnly\]\s\[Conformance\]$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:630
Jul 11 17:08:17.324: Unexpected error:
    <*errors.errorString | 0xc00024a290>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
occurred
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/network/dns_common.go:463
				
stdout/stderr: junit_08.xml



Kubernetes e2e suite [sig-network] DNS should provide DNS for pods for Hostname [LinuxOnly] [Conformance] 10m13s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-network\]\sDNS\sshould\sprovide\sDNS\sfor\spods\sfor\sHostname\s\[LinuxOnly\]\s\[Conformance\]$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:630
Jul 11 17:04:04.083: Unexpected error:
    <*errors.errorString | 0xc00033a280>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
occurred
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/network/dns_common.go:463
				
stdout/stderr: junit_10.xml



Kubernetes e2e suite [sig-network] DNS should provide DNS for services [Conformance] 10m19s

go run hack/e2e.go -v --test --test_args='--ginkgo.focus=Kubernetes\se2e\ssuite\s\[sig\-network\]\sDNS\sshould\sprovide\sDNS\sfor\sservices\s\s\[Conformance\]$'
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:630
Jul 11 17:10:04.907: Unexpected error:
    <*errors.errorString | 0xc000252290>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
occurred
/workspace/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/network/dns_common.go:463
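Taken together, the failures shown in this report cluster around a few root causes: most tests time out waiting for workloads or DNS, one test cannot reach cluster DNS at all, and the rest fail with test-specific messages. A rough triage sketch (the bucket names and keyword rules are my own, with one-line messages copied from this report; this is not any k8s tool):

```python
from collections import Counter

# One-line failure message per failing test, copied from this report.
FAILURES = {
    "StatefulSet rolling updates with PVCs":
        "Failed waiting for pods to enter running: timed out waiting for the condition",
    "ServiceAccounts OIDC discovery":
        'pod "oidc-discovery-validator" failed with status: Phase:Failed',
    "Kubectl Guestbook application":
        "Frontend service did not start serving content in 600 seconds.",
    "Kubectl in-cluster config":
        "lookup kubernetes.default.svc on 100.64.0.10:53: read: connection refused",
    "Conntrack drop INVALID entries":
        "Boom server pod did not sent any bad packet to the client",
    "DNS /etc/hosts entries": "timed out waiting for the condition",
    "DNS for pods for Hostname": "timed out waiting for the condition",
    "DNS for services": "timed out waiting for the condition",
}

def bucket(msg: str) -> str:
    """Coarse, hand-written classification; keyword rules are illustrative."""
    if "connection refused" in msg:
        return "cluster DNS unreachable"
    if "timed out" in msg or "did not start serving" in msg:
        return "timeout waiting for workload"
    return "test-specific failure"

counts = Counter(bucket(m) for m in FAILURES.values())
print(counts)
```

Grouping like this makes the pattern visible at a glance: five of the eight failures listed are timeouts (three of them DNS conformance tests), which together with the refused connections to 100.64.0.10:53 points toward cluster DNS health as the first thing to check.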