CrashPlan Community Forum – Support & Assistance

CrashPlan PRO / CrashPlan for Small Business using Docker on Synology (March/2018)


Docker has turned out to be a very good, easy-to-implement way to get CrashPlan PRO running without all the fuss of the adapted SPK package made available and maintained by the awesome patters.

There are multiple advantages of using Docker:

  • CrashPlan PRO no longer runs headless: you can access its GUI via a web browser or a VNC client.
  • It should handle updates somewhat more smoothly.
  • Docker can limit the resources attributed to a container, which would (hopefully) make CrashPlan’s resource consumption more manageable. Unfortunately, Synology’s Linux distribution does not support Docker’s CPU/memory limits, so that feature is unavailable here. In practice you can only cap RAM usage via the Java heap environment variable, as explained in the guide below.

 

I’m opting to try this out on my 2 GB DS412+ instead of sticking with CrashPlan PRO 4.9 with blocked upgrades, because I suspect that setup is not behaving very well (backups not being properly completed).

Here are the instructions, assuming you have SSH access enabled:

  1. Open Package Center, search for “Docker”, install the package and make sure it is running.
  2. A Docker icon will now be found in DSM’s launcher and you might want to drag it onto the desktop. Launch the app.

  3. Press Registry and enter “CrashPlan” in the search box. Look for “jlesage/crashplan-pro”, select it, and press the Download button.

  4. A notification bubble will appear in the left side menu, under Image, where you will be able to see the download progress.
  5. Open File Station, go to the docker shared folder, and create the appdata/crashplan folders inside it.

    Addendum: the reason we use SSH from here on is that we want the container to access the whole of volume1, and (AFAIK) that cannot be configured from DSM. You can still create the container with the Docker app, following the applicable recommendations below and adding whichever DSM-accessible paths you need to make available to CrashPlan.
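If you prefer staying in the terminal, the folders from step 5 can also be created over SSH. This is a minimal sketch that assumes the default docker shared folder sits on volume1; adjust BASE if yours differs:

```shell
# BASE is the shared folder created by the Docker package;
# adjust it if your setup places it on another volume.
BASE="${BASE:-/volume1/docker}"
# -p creates intermediate folders and is harmless if they already exist
mkdir -p "$BASE/appdata/crashplan"
```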

  6. Access your Synology via SSH, and issue sudo -i for root access. From here:
    1. You’ll need to initialise the Docker instance by setting some parameters. The complete parameter list is available on the image’s GitHub page. Here’s what I’m using:
      1. -e USER_ID=0 -e GROUP_ID=0 : user is root
      2. -e CRASHPLAN_SRV_MAX_MEM=1600M : sets the Java max heap size to 1600 MB (change as necessary, using an M suffix for MB or G for GB)
      3. -e SECURE_CONNECTION=1: for HTTPS access to the GUI, remove if you plan on using just HTTP
      4. -e VNC_PASSWORD=[YOUR-GUI-PASSWORD-HERE]: Use if you want a password to access the GUI
      5. --restart always: makes your container restart if the NAS restarts.
      6. -p 5800:5800: port to be used for GUI web interface access
      7. -p 5900:5900: port for GUI access via the VNC protocol (unnecessary if not used).
    2. Here’s the command to be issued with all the previous parameters considered:
      docker run -d \
      --name=crashplan-pro \
      -e USER_ID=0 -e GROUP_ID=0 \
      -e CRASHPLAN_SRV_MAX_MEM=1600M \
      -e SECURE_CONNECTION=1 \
      -p 5800:5800 \
      -p 5900:5900 \
      -v /volume1/docker/appdata/crashplan:/config:rw \
      -v /volume1/:/volume1:ro \
      --restart always \
      jlesage/crashplan-pro
    3. If it runs properly you should see a hash string, and the container should now show as running in the Docker DSM interface.
  7. You can now go to your browser and open your CrashPlan web GUI interface to finish the setup (notice the menu bar for Copy/Paste if needed):
    https://your-local-nas-url:5800/
    or http:// if you have not used SECURE_CONNECTION=1
  8. After you sign in, remember you’ll need to adopt your previous backup in order to resume backing up normally. This is offered right after you log in through the web GUI, as the option to Replace an Existing Device; choose Skip Step 2 – File Transfer. After this is done, CrashPlan may state it cannot find the locations of the backed-up folders (in my case it worked just fine as is), and this is where you point those to their locations on the NAS.

 

One of the nice touches this solution adds is that you can now easily see the actual RAM usage of the CrashPlan instance from the DSM Docker interface, in a much nicer way.

 

Final Notes:

  • The file / folder area of the GUI web interface is scrollable, and you will probably need to scroll down before finding “volume1”.
  • You can rename your backup sets and device as well via the GUI.
  • To access the internal command line press Ctrl+Shift+C (full command list here).
  • In my case I had to launch backup.scan so that the file selection was actually populated.

Upgrading the Container:

  • If you’re looking for how to upgrade your Docker container, and how to keep up to date on container updates, check this article.

 

Optimisations:

  • There’s a known issue with very large file sets, where you have to edit the /etc.defaults/sysctl.conf file to raise the inotify watch limit above its default (here, to 1,048,576 watched files). Note that this file will probably be reset by DSM upgrades. Add this to the file:
    fs.inotify.max_user_watches=1048576
  • A possible way to make this survive upgrades is to create a Scheduled Task in DSM that re-adds this line to that file.
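The scheduled task could run a small script along these lines (a sketch: CONF defaults to the DSM file mentioned above, and the sysctl -w call applies the value immediately without waiting for a reboot):

```shell
# Re-add the inotify line after DSM upgrades. CONF defaults to the
# file mentioned above; override it when testing elsewhere.
CONF="${CONF:-/etc.defaults/sysctl.conf}"
LINE="fs.inotify.max_user_watches=1048576"
# Append only if the exact line is not already present (idempotent)
if [ -f "$CONF" ] && ! grep -qxF "$LINE" "$CONF"; then
  echo "$LINE" >> "$CONF"
fi
# Apply immediately; ignore errors when not running as root
sysctl -w fs.inotify.max_user_watches=1048576 2>/dev/null || true
```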


44 comments for “CrashPlan PRO / CrashPlan for Small Business using Docker on Synology (March/2018)”

  1. Mark Seidler
    March 8, 2018 at 11:46

    Thank you for the article. The description also helped me a lot. Unfortunately I still have a problem.
    I got to point 6.
    From point 7 I have a problem: although I can see the “CrashPlan web GUI interface”, I can not log in.
    See photo in the link:
    Can anybody help me?
    https://www.dropbox.com/s/oqt0mgf3wze9i8z/Ohne%20Titel.jpeg?dl=0

    • an
      March 8, 2018 at 11:48

      That’s weird – are you sure you’re using the same login details that give you access to crashplanpro.com?
      I’m not seeing any reason for that to be happening and I would tend to think that it’s related to Code42 and not your setup…

      • Mark Seidler
        March 9, 2018 at 14:15

        Yes, on the page crashplan.com the login works.

        • an
          March 9, 2018 at 14:41

          Have you disabled the old package…? I’m really not seeing what’s wrong

          • Mark Seidler
            March 9, 2018 at 14:47

            Should I deinstall the package Crashplan Pro from the package center?

          • an
            March 9, 2018 at 15:01

            I think it should be disabled, not uninstalled.

          • Mark Seidler
            March 9, 2018 at 15:08

            it is stopped.
            how can i disable it

          • an
            March 9, 2018 at 15:10

            That’s the same thing – that’s not the issue, for sure.
            Did you create the “appdata/crashplan” folders in the “docker” shared folder?

          • an
            March 9, 2018 at 15:12

            And if so, make sure the folders are empty.
            [from: https://github.com/jlesage/docker-crashplan-pro/issues/10 ]

          • Mark Seidler
            March 9, 2018 at 15:20

            Yes, I have created the folder. Folder and file have apparently been opened in this folder automatically

          • an
            March 9, 2018 at 15:22

            Stop the docker package, rename the “crashplan” folder as a backup and recreate it, making sure it is empty. It might be the case that there’s a file there from a previous setup attempt of yours.

          • Mark Seidler
            March 9, 2018 at 22:21

            I do not get it. I did everything exactly as you said. but it does not work. I even reinstalled everything. Do you have another tip?

          • an
            March 9, 2018 at 22:25

            Have you looked at the github link I posted earlier? https://github.com/jlesage/docker-crashplan-pro/issues/10
            I suggest you follow the tips there for checking the service logs and see if there’s something there, and if not post a question over there, it’s probably the best place to do so… I really can’t say what could it be, but do get back if you fix it!

          • an
            March 9, 2018 at 23:08

            A final suggestion – try following this guide below and see if there’s anything different from this article:
            https://github.com/jlesage/docker-crashplan-pro/issues/34#issuecomment-354888938

  2. Dave Cleminson
    March 10, 2018 at 01:17

    Hi All,
    Worked like a charm for me….
    Followed the instructions.
    Took me half an hour including the initialising & scanning for my 1TB+ backup.
    Just completed the backup to get everything up to date..

    Thanks for the great article.

    DC

  3. March 30, 2018 at 06:05

    Hi! That seems awesome but it doesn’t work yet for me. I can see the CrashPlan Web GUI, but I can’t login to CrashPlan with it, I am getting: “Unable to sign in. Can’t connect to server.”. So I guess it can’t connect to CrashPlan servers? But of course my Synology has access to Internet. What could it be? Thanks 🙂

  4. rbmanian75
    April 3, 2018 at 08:59

    I have installed the container and it works fine. But now i want to remove the SECURE_CONNECTION. How to modify the container? Is it possible to do it without deleting the container and recreating it?

    • an
      April 3, 2018 at 09:09

      You should be able by (stopping and) editing the container from Docker itself

  5. Mohd Shahid
    April 24, 2018 at 13:52

    Hi, may I know if this works with Crashplan client 6.7.1 on Mac? For some reason my Synology NAS running on Docker is still backing up but I can’t seem to be able to add more folders on the NAS to backup but I can access the backed up files.

    • an
      April 24, 2018 at 22:59

      I’m sorry, but I don’t understand what the issue is… Is CrashPlan running on your Synology, under Docker…? If so, you don’t need to use the Mac client, you’ll access CrashPlan via browser.

      • Mohd Shahid
        April 28, 2018 at 17:08

        My bad, yes its running under Docker but it’s a different Crashplan pro docker image. Added your docker, followed the instructions and all working fine with backup running.

        Only problem is when I select Manage Files to add more folders nothing new can be seen it’s only showing those folders I’ve backed up before. Volume 1 has many other folders and subfolders but nothing else is showing. I will delete the old docker container and downloaded image file and retry once my backup is complete.

        Do you have any other ideas though?

        • an
          May 2, 2018 at 14:14

          This might be a stretch, but are you sure you’re not just missing the scroll on the folder window? I’m saying this because it’s not obvious at all that the window section scrolls down to reveal the rest of the folders.
          Other than that, if you’re seeing something but not everything, then it might be something wrong with the user permissions you considered (in the instructions above we’re using root, see item 6.1.1.).

          • Mohd Shahid
            May 7, 2018 at 14:11

            Yup followed your instructions to a “T”. I’ve scrolled and clicked volume 1 and can see only my 2 original folders and nothing else in it. When I first set up my Synology NAS, I disabled the admin account and created my own account which is given admin privileges. I did an ID of my account and the User ID and Group ID is 1026 and 100 respectively.

            I’ve stopped the container and updated the UID and GID environment variables via the docker GUI(and removed a number of repeated e environment variables), restarted and no luck still. Just not sure why it cannot display my home folder still and it is backing up 0 bytes even though I know there are new things to backup.

            Any other ideas?

          • an
            May 11, 2018 at 11:42

            I would suggest you to try using the exact user id and group id from the instructions to have root access, and work from there.

  6. Jason de la Cruz
    May 5, 2018 at 20:53

    Thanks, this worked! The UI is painfully slow on my 1515+, but it certainly works!

    • Mohd Shahid
      May 11, 2018 at 03:40

      Restart the Synology itself after all the processes are complete and it will improve. There will always be a slight lag, but it will certainly be faster than when you first used it.

  7. Stuart Fagg
    May 29, 2018 at 08:50

    Thank you so much, your instructions were clear & worked fine. I’d almost resigned myself to abandoning CP before this. Quick query on optimisations, the “fs.inotify.max_user_watches=1048576”, should I increase that value or does adding the line effectively remove the file limit?

    • an
      May 30, 2018 at 12:54

      Thanks – I can’t really say for sure if that line actually does any benefit at this time or not… If it does, that value should take care of it.

  8. greenwell
    June 5, 2018 at 09:13

    Hi,

    I followed the instructions exactly but it doesn’t pick up the -v /volume1/:/volume1:ro, hence I cannot select files to be backed up. It can see the /volume1/docker/appdata/crashplan:/config:rw

    I have also changed to root whilst executing the command

    Any idea?

    • an
      June 5, 2018 at 09:33

      Hmm – maybe you’re trying to use another user_id/group_id other than root’s?
      “-e USER_ID=0 -e GROUP_ID=0 “

      • greenwell
        June 5, 2018 at 09:39

        Hi, thanks for getting back to me. I have used “-e USER_ID=0 -e GROUP_ID=0 ” with 0 and 1000 but it makes no difference.

        • an
          June 5, 2018 at 11:31

          Can you elaborate on how you came to the conclusion that it “doesn’t pick up the -v /volume1/:/volume1:ro”?

  9. July 10, 2018 at 00:24

    Thank you very much for this article, worked on my first attempt.

  10. Jeff Bartels
    September 21, 2018 at 14:30

    I have CPP running in Docker – attached to a backup (@4Tb) of a 16Tb Synology NAS that was already on Crashplan. When I check on the backup using browser GUI (https://…:5800), it shows 143Tb of selected files! Cant figure out why. Maybe it has something to do with the way I created the CPP container? Originally, EVERYTHING from the Linux root (‘/’) on down was selected – I unselected everything but volume2 (where all of my user data is stored). Any suggestions on where to start? I don’t have 53 years to wait for this backup to complete LOL

  11. Kay Günther
    November 21, 2018 at 14:12

    docker: Error response from daemon: Bind mount failed: ‘/volume1/docker/appdata/crashplan’ does not exists.

    The error comes after step 6.2.

    • an
      November 21, 2018 at 14:23

      Hi – check your step 5, those are folders you have to create yourself… Check if none of the characters are capital or anything as such.
      Regards

      • Kay Günther
        November 21, 2018 at 14:42

        Thank you for your quick answer. My folder’s name was “crashplan-pro” :D. I repeated the steps with the folder name “crashplan”. The docker container is now running, but I cannot access the GUI.
        I come from crashplan 4.9 installed with the java version 151. Is that java version too old ? Both are still installed.

        • an
          November 21, 2018 at 14:46

          Nice. Just follow the steps – step 7 shows how to access the gui.
          You should also update your Java installation.
          I think it’s best to at least stop all processes from crashplan 4.9 and after testing SMB just uninstall it altogether.

          • Kay Günther
            November 21, 2018 at 15:19

            Ok all works fine. My last problem is the inotify watch limit. I add to the file “fs.inotify.max_user_watches=1048576” but the notification in the gui still appears.

          • an
            November 21, 2018 at 16:02

            What notification? That setting is about something that should not be visible at all in the gui.

          • Kay Günther
            November 22, 2018 at 12:56

            The notification is gone. My last problem is that crashplan says i have over 140 tb of data?
            do you have a solution for it or i need to delete my backup and download it again?

          • an
            November 25, 2018 at 00:03

            I have no idea if that is accurate or not; you should check what dataset is being selected for backup.
            Cheers!
