Result: FAILURE
Tests: 0 failed / 0 succeeded
Started: 2022-08-03 04:27
Elapsed: 16m31s
Revision: release-1.4

No Test Failures!


Error lines from build-log.txt

... skipping 639 lines ...
certificate.cert-manager.io "selfsigned-cert" deleted
# Create secret for AzureClusterIdentity
./hack/create-identity-secret.sh
make[2]: Entering directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
make[2]: Nothing to be done for 'kubectl'.
make[2]: Leaving directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
Error from server (NotFound): secrets "cluster-identity-secret" not found
secret/cluster-identity-secret created
secret/cluster-identity-secret labeled
# Create customized cloud provider configs
./hack/create-custom-cloud-provider-config.sh
make[2]: Entering directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
make[2]: Nothing to be done for 'kubectl'.
... skipping 133 lines ...
configmap/csi-proxy-addon created
configmap/containerd-logger-capz-07m2yp created
clusterresourceset.addons.cluster.x-k8s.io/metrics-server-capz-07m2yp created
configmap/metrics-server-capz-07m2yp created
# Wait for the kubeconfig to become available.
timeout --foreground 300 bash -c "while ! /home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/hack/tools/bin/kubectl-v1.22.4 get secrets | grep capz-07m2yp-kubeconfig; do sleep 1; done"
make[1]: *** [Makefile:307: create-workload-cluster] Error 124
make[1]: Leaving directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
make: *** [Makefile:337: create-cluster] Error 2
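The `Error 124` above comes from the `timeout --foreground 300 ...` wrapper: GNU `timeout` exits with status 124 when the time budget elapses before the wrapped command finishes, so the kubeconfig secret never appeared within the 300-second window. A minimal illustration of that exit-code convention (the 1s/5s durations here are illustrative, not from the log):

```shell
# `timeout` kills the wrapped command and exits 124 when the budget elapses;
# this is the same status the Makefile's create-workload-cluster wait loop
# propagated as "Error 124".
timeout 1 sleep 5
echo "exit code: $?"   # prints "exit code: 124"
```

Any non-124 status would instead mean the wrapped command itself failed, which distinguishes "the secret never showed up in time" from "kubectl errored out".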
Collecting logs for cluster capz-07m2yp in namespace default and dumping logs to /logs/artifacts
INFO: Creating log watcher for controller capi-kubeadm-bootstrap-system/capi-kubeadm-bootstrap-controller-manager, pod capi-kubeadm-bootstrap-controller-manager-8447dbccc5-f9fsl, container manager
INFO: Creating log watcher for controller capi-kubeadm-control-plane-system/capi-kubeadm-control-plane-controller-manager, pod capi-kubeadm-control-plane-controller-manager-5d98d9dbdd-9jtwh, container manager
INFO: Creating log watcher for controller capi-system/capi-controller-manager, pod capi-controller-manager-746c779ddc-l7284, container manager
INFO: Creating log watcher for controller capz-system/capz-controller-manager, pod capz-controller-manager-59596978b5-q47t9, container manager
STEP: Dumping workload cluster default/capz-07m2yp logs
Aug  3 04:40:32.961: INFO: Collecting logs for Linux node capz-07m2yp-md-0-4zcbl in cluster capz-07m2yp in namespace default

Aug  3 04:41:32.962: INFO: Collecting boot logs for AzureMachine capz-07m2yp-md-0-4zcbl

Failed to get logs for machine capz-07m2yp-md-0-bbbb74ff9-vxbqw, cluster default/capz-07m2yp: [open /etc/azure-ssh/azure-ssh: no such file or directory, Unable to collect VM Boot Diagnostic logs: AzureMachine provider ID is nil]
Aug  3 04:41:32.994: INFO: Collecting logs for Linux node capz-07m2yp-md-0-rhdcd in cluster capz-07m2yp in namespace default

Aug  3 04:42:32.996: INFO: Collecting boot logs for AzureMachine capz-07m2yp-md-0-rhdcd

Failed to get logs for machine capz-07m2yp-md-0-bbbb74ff9-xw248, cluster default/capz-07m2yp: [open /etc/azure-ssh/azure-ssh: no such file or directory, Unable to collect VM Boot Diagnostic logs: AzureMachine provider ID is nil]
STEP: Dumping workload cluster default/capz-07m2yp kube-system pod logs
panic: Timed out after 60.001s.
Failed to get default/capz-07m2yp-kubeconfig
Expected success, but got an error:
    <*errors.StatusError | 0xc001206b40>: {
        ErrStatus: {
            TypeMeta: {Kind: "", APIVersion: ""},
            ListMeta: {
                SelfLink: "",
                ResourceVersion: "",
... skipping 14 lines ...
            Code: 404,
        },
    }
    secrets "capz-07m2yp-kubeconfig" not found

goroutine 1 [running]:
main.Fail({0xc000832380?, 0x16?}, {0xc000be90d0?, 0x5?, 0x5?})
	/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/test/logger.go:37 +0x2d
github.com/onsi/gomega/internal.(*AsyncAssertion).match.func1({0x2a9966e, 0x9})
	/home/prow/go/pkg/mod/github.com/onsi/gomega@v1.17.0/internal/async_assertion.go:184 +0x31d
github.com/onsi/gomega/internal.(*AsyncAssertion).match(0xc000113e00, {0x2db1510, 0x3f1bf20}, 0x1, {0xc000a00460, 0x2, 0x2})
	/home/prow/go/pkg/mod/github.com/onsi/gomega@v1.17.0/internal/async_assertion.go:206 +0x578
github.com/onsi/gomega/internal.(*AsyncAssertion).Should(0xc000113e00, {0x2db1510, 0x3f1bf20}, {0xc000a00460, 0x2, 0x2})
... skipping 23 lines ...