Job canceled to protect the VDI chain



  • There's been a lot written about this, but none of it seems to match my situation.

    I'm trying to back up a single XenServer 7.2 host in a pool that keeps its VMs on NFS storage. The VMs that are powered off back up successfully; however, every VM that is running fails with the "Job canceled to protect the VDI chain" error.

    None of the machines have any snapshots. How can I resolve this and get the backups running?
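    For reference, this is roughly how I confirmed there are no snapshots or leftover chain segments (a sketch run from the pool master's dom0; the <vm-uuid> and <sr-uuid> placeholders have to be filled in, and exact field names can vary by version):

    # Snapshots still attached to a given VM.
    xe snapshot-list snapshot-of=<vm-uuid> params=uuid,name-label

    # VDIs on the SR; many more VDIs than expected usually means old chain
    # segments are still waiting to be coalesced.
    xe vdi-list sr-uuid=<sr-uuid> params=uuid,name-label,is-a-snapshot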



  • Test your connection to the NFS storage (Settings -> Remote); you can even delete it and add it again (a quick manual check from the shell is sketched below).

    Then delete your backup schedule for this machine, create a new one, and try again.
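    If you want to check the remote outside of XO as well, something along these lines from the XO VM's shell works (a sketch; <nfs-server> and /export/backups are placeholders for your remote's settings, and it assumes the NFS client tools are installed):

    # The export should be listed by the server.
    showmount -e <nfs-server>

    # Mount it, confirm write access, then clean up.
    mkdir -p /tmp/nfs-test
    mount -t nfs <nfs-server>:/export/backups /tmp/nfs-test
    touch /tmp/nfs-test/write-test && rm /tmp/nfs-test/write-test
    umount /tmp/nfs-test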



  • It might help if you posted the backup logs showing the errors.



  • The same thing is happening to me. How can I help?



  • The problem I had seemed to be related to the XenServer host's memory being full. As soon as I turned off one of the VMs, the backups started to run cleanly (a quick way to check free host memory is sketched below).
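    In case it helps, this is roughly how to check it from the pool master's dom0 (a sketch; exact parameter names can differ slightly between XenServer versions):

    # Total and computed free memory per host.
    xe host-list params=name-label,memory-total,memory-free-computed

    # Memory actually used by each running VM.
    xe vm-list power-state=running params=name-label,memory-actual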



  • But my two pools have plenty of RAM, so it has to be something else.



  • Look under Dashboard > Health. Do you see anything under the Alarms section?

    Also, you should review this for a better understanding of the issue -- https://xen-orchestra.com/docs/backup_troubleshooting.html
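    If you prefer the CLI, the same pool alerts can also be listed from the pool master's dom0 (a rough sketch; the params selection is optional):

    # Messages/alerts as surfaced by XenCenter and XO.
    xe message-list params=name,priority,timestamp,body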



  • No, there are no alarms. There is also enough RAM, enough room in the SR, the machine I want to delta back up has no snapshots, and the destination pool is empty (no VMs at all).

    The only thing I have noticed is that the snapshot timestamps are 2 hours behind the time configured on my computer (Europe, UTC+2).



  • Are you able to replicate this issue with the following steps?

    1. Confirm that no alerts exist in the Health section
    2. Manually execute the backup job


  • Hi, thanks.

    Actually, I was already running the backup manually, and no, there are no alerts.
    I have now retried a machine after a while, and that machine is currently running its CR backup.

    By the way, there is a job that says "interrupted"; I don't know why that happened, I didn't touch anything.

    (screenshot: 0_1522952101410_Selecció_033.png)



  • Can you tell us more about your SR?

    FWIW, each time you run a Delta or CR backup, a snapshot gets created and then removed (a quick way to watch the resulting coalesce work from dom0 is sketched below). Did you read the link I posted earlier?

    Another good read -- https://xen-orchestra.com/blog/xenserver-coalesce-detection-in-xen-orchestra/
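    If you want to see whether that coalesce work is actually happening, something like this from the dom0 of the host serving the SR gives a quick view (a sketch; <sr-uuid> is a placeholder, and XO also surfaces pending work as "VDIs to coalesce" under Dashboard > Health):

    # Recent garbage-collector / coalesce activity.
    grep -i coalesce /var/log/SMlog | tail -n 20

    # An SR scan also kicks the GC, which eventually merges leftover chain segments.
    xe sr-scan uuid=<sr-uuid>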



  • Sure. At the moment I have two pools:
    Production: XenServer 7.1 with two NFS SRs (iSCSI 10Gb)
    Development: XCP-ng 7.4 with one HBA SR (Fibre Channel)
    There is also one more SR used by development, NFS, which also serves as the external device for backups.

    At the moment I am using the Development pool; the machines I copy come from the Production pool.



  • Also, when the backup stops, I find a machine in the destination pool called
    "[Importing...] VMname (date time)" (how I clean those up is sketched below).



  • Hi, I've done a new test.

    3 VMs, each on a different SR (2 on HBA, 1 on NFS), called SR600 and SR900 (the HBA ones) and SRSYN (the NFS one).
    The same error occurs on all of them. They are Debian 9 VMs, copied from the first of them.
    XenServer 7.1
    XOA 5.18

    Backing up to an XCP-ng 7.4 host.

    backup-ng:
    VM: x-debian9-testXO-900 (Dell-pool)
    tag: CR-debian9-testXO*
    Start: Apr 6, 2018, 9:19:27 AM
    [{"message":"VDI_IO_ERROR(Device I/O errors)","stack":"XapiError: VDI_IO_ERROR(Device I/O errors)\n at wrapError (/opt/xen-orchestra/packages/xen-api/src/index.js:111:9)\n at getTaskResult (/opt/xen-orchestra/packages/xen-api/src/index.js:189:22)\n at Xapi._addObject (/opt/xen-orchestra/packages/xen-api/src/index.js:796:8)\n at /opt/xen-orchestra/packages/xen-api/src/index.js:832:13\n at arrayEach (/opt/xen-orchestra/node_modules/lodash/_arrayEach.js:15:9)\n at forEach (/opt/xen-orchestra/node_modules/lodash/forEach.js:38:10)\n at Xapi._processEvents (/opt/xen-orchestra/packages/xen-api/src/index.js:827:12)\n at onSuccess (/opt/xen-orchestra/packages/xen-api/src/index.js:850:11)\n at run (/opt/xen-orchestra/node_modules/core-js/modules/es6.promise.js:66:22)\n at /opt/xen-orchestra/node_modules/core-js/modules/es6.promise.js:79:30\n at flush (/opt/xen-orchestra/node_modules/core-js/modules/_microtask.js:18:9)\n at process._tickCallback (internal/process/next_tick.js:112:11)","code":"VDI_IO_ERROR","params":["Device I/O errors"],"url":"https://192.168.222.202/import_raw_vdi/?format=vhd&vdi=OpaqueRef%3A5bd7e5fc-2e3d-4b8f-b054-1ce70885ccb2&session_id=OpaqueRef%3Add4a20ae-01cc-4074-98b9-3ce160a0d9e2&task_id=OpaqueRef%3Ace5bcd03-5b41-4dbb-9c9f-1ad0af070480"}]
    
    backup-ng:
    VM: x-debian9-testXO-SYN (Dell-pool)
    tag: CR-debian9-testXO*
    Start: Apr 6, 2018, 9:19:27 AM
    [{"message":"VDI_IO_ERROR(Device I/O errors)","stack":"XapiError: VDI_IO_ERROR(Device I/O errors)\n at wrapError (/opt/xen-orchestra/packages/xen-api/src/index.js:111:9)\n at getTaskResult (/opt/xen-orchestra/packages/xen-api/src/index.js:189:22)\n at Xapi._addObject (/opt/xen-orchestra/packages/xen-api/src/index.js:796:8)\n at /opt/xen-orchestra/packages/xen-api/src/index.js:832:13\n at arrayEach (/opt/xen-orchestra/node_modules/lodash/_arrayEach.js:15:9)\n at forEach (/opt/xen-orchestra/node_modules/lodash/forEach.js:38:10)\n at Xapi._processEvents (/opt/xen-orchestra/packages/xen-api/src/index.js:827:12)\n at onSuccess (/opt/xen-orchestra/packages/xen-api/src/index.js:850:11)\n at run (/opt/xen-orchestra/node_modules/core-js/modules/es6.promise.js:66:22)\n at /opt/xen-orchestra/node_modules/core-js/modules/es6.promise.js:79:30\n at flush (/opt/xen-orchestra/node_modules/core-js/modules/_microtask.js:18:9)\n at process._tickCallback (internal/process/next_tick.js:112:11)","code":"VDI_IO_ERROR","params":["Device I/O errors"],"url":"https://192.168.222.202/import_raw_vdi/?format=vhd&vdi=OpaqueRef%3Acb285f46-5409-444b-b8b8-0bfdaef5f96e&session_id=OpaqueRef%3Add4a20ae-01cc-4074-98b9-3ce160a0d9e2&task_id=OpaqueRef%3Acb234b28-be6a-4e7d-9437-7a1bccae3646"}]`
    
    backup-ng:
    VM: x-debian9-testXO-600 (Dell-pool)
    tag: CR-debian9-testXO*
    Start: Apr 6, 2018, 9:19:27 AM
    [{"message":"VDI_IO_ERROR(Device I/O errors)","stack":"XapiError: VDI_IO_ERROR(Device I/O errors)\n at wrapError (/opt/xen-orchestra/packages/xen-api/src/index.js:111:9)\n at getTaskResult (/opt/xen-orchestra/packages/xen-api/src/index.js:189:22)\n at Xapi._addObject (/opt/xen-orchestra/packages/xen-api/src/index.js:796:8)\n at /opt/xen-orchestra/packages/xen-api/src/index.js:832:13\n at arrayEach (/opt/xen-orchestra/node_modules/lodash/_arrayEach.js:15:9)\n at forEach (/opt/xen-orchestra/node_modules/lodash/forEach.js:38:10)\n at Xapi._processEvents (/opt/xen-orchestra/packages/xen-api/src/index.js:827:12)\n at onSuccess (/opt/xen-orchestra/packages/xen-api/src/index.js:850:11)\n at run (/opt/xen-orchestra/node_modules/core-js/modules/es6.promise.js:66:22)\n at /opt/xen-orchestra/node_modules/core-js/modules/es6.promise.js:79:30\n at flush (/opt/xen-orchestra/node_modules/core-js/modules/_microtask.js:18:9)\n at process._tickCallback (internal/process/next_tick.js:112:11)","code":"VDI_IO_ERROR","params":["Device I/O errors"],"url":"https://192.168.222.202/import_raw_vdi/?format=vhd&vdi=OpaqueRef%3Afcb4e88f-bc68-4929-b44f-ff90fde9e64c&session_id=OpaqueRef%3Aa09a2f3b-46e1-4d49-bb3b-b29a7ca6ccb0&task_id=OpaqueRef%3Ad1974cf4-5f37-4c7b-b0ba-b21c358f92b8"}]
    


  • Are you using it from the sources or via XOA?



  • Hi, I'm using it from the sources; it's XO 5.18. My mistake saying XOA previously.



