Network Controller / SDNC-630

sdnc controller blueprints is crashing


      I'm deploying ONAP with OOM master and the docker images from the Integration team's staging manifest (https://git.onap.org/integration/plain/version-manifest/src/main/resources/docker-manifest-staging.csv).

      So it's deploying the 'onap/ccsdk-controllerblueprints:0.4-STAGING-latest' image.
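
      To double-check which tag the staging manifest actually pins for this component, the CSV can be grepped directly (a minimal sketch; curl and grep are assumed to be available on the deployment host):

          # the manifest is a plain "image,version" CSV, so filter on the component name
          curl -s https://git.onap.org/integration/plain/version-manifest/src/main/resources/docker-manifest-staging.csv \
            | grep ccsdk-controllerblueprints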

      At the end of the deployment, the sdnc controller-blueprints pod is crashing:

      onap-sdnc-controller-blueprints-6547954976-q8ctc               0/1       CrashLoopBackOff        66         5h
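
      For reference, the listing above is just the usual pod status query (a minimal sketch, assuming the onap namespace):

          kubectl -n onap get pods | grep controller-blueprints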
      

      Output of kubectl describe for the pod:

      Name:           onap-sdnc-controller-blueprints-6547954976-q8ctc
      Namespace:      onap
      Node:           compute04-onap-master.novalocal/10.253.0.25
      Start Time:     Fri, 08 Feb 2019 04:23:59 +0000
      Labels:         app=controller-blueprints
                      pod-template-hash=2103510532
                      release=onap-sdnc
      Annotations:    kubernetes.io/created-by={"kind":"SerializedReference","apiVersion":"v1","reference":{"kind":"ReplicaSet","namespace":"onap","name":"onap-sdnc-controller-blueprints-6547954976","uid":"5aee09eb-2b59-11...
      Status:         Running
      IP:             10.42.118.152
      Created By:     ReplicaSet/onap-sdnc-controller-blueprints-6547954976
      Controlled By:  ReplicaSet/onap-sdnc-controller-blueprints-6547954976
      Init Containers:
        controller-blueprints-readiness:
          Container ID:  docker://b827c2edfeef394eae88c91eff95b50ab2ed57eb19bbde75ea40c1fe3979441d
          Image:         oomk8s/readiness-check:2.0.0
          Image ID:      docker-pullable://oomk8s/readiness-check@sha256:7daa08b81954360a1111d03364febcb3dcfeb723bcc12ce3eb3ed3e53f2323ed
          Port:          <none>
          Command:
            /root/ready.py
          Args:
            --container-name
            controller-blueprints-db
          State:          Terminated
            Reason:       Completed
            Exit Code:    0
            Started:      Fri, 08 Feb 2019 04:26:19 +0000
            Finished:     Fri, 08 Feb 2019 04:31:36 +0000
          Ready:          True
          Restart Count:  0
          Environment:
            NAMESPACE:  onap (v1:metadata.namespace)
          Mounts:
            /var/run/secrets/kubernetes.io/serviceaccount from default-token-5bbrg (ro)
      Containers:
        controller-blueprints:
          Container ID:   docker://da6b380478fe8b3d3953dd41ec9190454015e80e71b4926b0fe59e3bb019123f
          Image:          nexus3.onap.org:10001/onap/ccsdk-controllerblueprints:0.4-STAGING-latest
          Image ID:       docker-pullable://nexus3.onap.org:10001/onap/ccsdk-controllerblueprints@sha256:a707922c04c5d39d91877f3ff1a95c02d75505f78277b3e307f1a91e5b9dd838
          Port:           8080/TCP
          State:          Waiting
            Reason:       CrashLoopBackOff
          Last State:     Terminated
            Reason:       Error
            Exit Code:    1
            Started:      Fri, 08 Feb 2019 09:44:03 +0000
            Finished:     Fri, 08 Feb 2019 09:44:10 +0000
          Ready:          False
          Restart Count:  66
          Liveness:       tcp-socket :8080 delay=10s timeout=1s period=10s #success=1 #failure=3
          Readiness:      tcp-socket :8080 delay=10s timeout=1s period=10s #success=1 #failure=3
          Environment:
            APPLICATIONNAME:    ControllerBluePrints
            BUNDLEVERSION:      1.0.0
            APP_CONFIG_HOME:    /opt/app/onap/config
            DB_URL:             jdbc:mysql://controller-blueprints-db:3306/sdnctl
            DB_USER:            sdnctl
            DB_PASSWORD:        <set to the key 'db-root-password' in secret 'onap-sdnc-controller-blueprints'>  Optional: false
            MS_USER:            <set to the key 'restUser' in secret 'onap-sdnc-controller-blueprints'>          Optional: false
            MS_PASSWORD:        <set to the key 'restPassword' in secret 'onap-sdnc-controller-blueprints'>      Optional: false
            INIT_DATA_LOAD:     true
            STICKYSELECTORKEY:
            ENVCONTEXT:         DEV
          Mounts:
            /etc/localtime from localtime (ro)
            /opt/app/onap/config/application.properties from onap-sdnc-controller-blueprints-config (rw)
            /opt/app/onap/config/logback.xml from onap-sdnc-controller-blueprints-config (rw)
            /var/run/secrets/kubernetes.io/serviceaccount from default-token-5bbrg (ro)
      Conditions:
        Type           Status
        Initialized    True
        Ready          False
        PodScheduled   True
      Volumes:
        localtime:
          Type:  HostPath (bare host directory volume)
          Path:  /etc/localtime
        onap-sdnc-controller-blueprints-config:
          Type:      ConfigMap (a volume populated by a ConfigMap)
          Name:      onap-sdnc-controller-blueprints-configmap
          Optional:  false
        default-token-5bbrg:
          Type:        Secret (a volume populated by a Secret)
          SecretName:  default-token-5bbrg
          Optional:    false
      QoS Class:       BestEffort
      Node-Selectors:  <none>
      Tolerations:     node.alpha.kubernetes.io/notReady:NoExecute for 300s
                       node.alpha.kubernetes.io/unreachable:NoExecute for 300s
      Events:
        Type     Reason      Age                  From                                      Message
        ----     ------      ----                 ----                                      -------
        Warning  FailedSync  13m (x1332 over 5h)  kubelet, compute04-onap-master.novalocal  Error syncing pod
        Warning  BackOff     8m (x1353 over 5h)   kubelet, compute04-onap-master.novalocal  Back-off restarting failed container
        Normal   Pulling     3m (x67 over 5h)     kubelet, compute04-onap-master.novalocal  pulling image "nexus3.onap.org:10001/onap/ccsdk-controllerblueprints:0.4-STAGING-latest"
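
      The describe output only records Exit Code 1, so the actual failure reason has to come from the application log of the failing container. A minimal sketch for collecting it (pod and container names are the ones from the output above; --previous fetches the log of the last crashed attempt):

          # pod/container names taken from the describe output above
          kubectl -n onap logs onap-sdnc-controller-blueprints-6547954976-q8ctc \
            -c controller-blueprints --previous

      Since the container exits only a few seconds after starting and DB_URL points at controller-blueprints-db:3306, another quick check is whether that database service is reachable at all from inside the namespace (a sketch using a throwaway busybox pod; "db-check" is just a placeholder name, and nc flag support depends on the busybox build):

          # "db-check" is an arbitrary pod name; service name/port come from DB_URL above
          kubectl -n onap run db-check --rm -it --restart=Never --image=busybox -- \
            nc -zv controller-blueprints-db 3306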
      

       

            Assignee:  sdesbure
            Reporter:  sdesbure
            Votes:     0
            Watchers:  5
