After all my data was lost again… I’ve now created a backup script that runs on the Synology NAS. The script is inspired by vespaman’s, but it had to be changed a bit because the Synology shell is missing the advanced date command.
First, enable SSH on the NAS.
Log in to the NAS as root via SSH.
Set up an SSH keypair between the NAS and Vera by running ssh-keygen, then copy the public key to Vera’s /etc/dropbear/authorized_keys.
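The key setup can be sketched like this; the scratch directory is only for illustration (a real setup would write to /root/.ssh), and the Vera address matches the one in the script below:

```shell
# On the Synology NAS, as root: generate a passphrase-less RSA keypair.
# (The /tmp path is only for illustration; normally use /root/.ssh/id_rsa.)
mkdir -p /tmp/nas-keys
ssh-keygen -t rsa -N "" -f /tmp/nas-keys/id_rsa -q
# The public key then has to be appended to dropbear's authorized_keys on
# the Vera, e.g. (assumes the Vera is reachable at 192.168.178.47):
#   ssh root@192.168.178.47 "cat >> /etc/dropbear/authorized_keys" < /tmp/nas-keys/id_rsa.pub
```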
Copy the following script to your Synology NAS. I’ve put my script and backups in the /volume1/NAS/HA folder. Please update the highlighted parts (IP address and destination folder) to match your setup.
#!/bin/ash
date=$(date +%Y-%m-%d)
destdir=/volume1/NAS/HA
# Get the files. Vera does not have rsync, so we use regular scp.
scp -qpr root@192.168.178.47:/dataMine "$destdir/"
# Create the backup tarball.
tar -cpzf "$destdir/datamine_$date.tar.gz" "$destdir/dataMine"
# Restrict permissions on the archive.
chmod 600 "$destdir/datamine_$date.tar.gz"
# Remove the temporary copy.
rm -rf "$destdir/dataMine"
# Remove backups older than two months.
find "$destdir"/datamine_* -mtime +60 -exec rm {} \;
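To run the script automatically, a line in the Synology’s /etc/crontab along these lines should do. The script name and time of day here are my assumptions; note that Synology’s cron requires TAB-separated fields, and on older DSM firmware you have to restart crond before changes take effect:

```
# /etc/crontab -- run the backup every night at 02:30 (fields TAB-separated)
30	2	*	*	*	root	/volume1/NAS/HA/datamine_backup.sh
```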
Hi @hek,
Just a quick note that the data format will change with the next release. I haven’t checked your script, but you’ll need to make sure it includes all subdirectories under the /dataMine directory.
hek,
Can you add a cleanup function as well? Logging stops after about a month, since DM seems to have issues when there are more than about 30-40 files in the folder. Once I deleted the old files, DM started working again. It would be nice to have it back up daily and remove any files older than 2 weeks.
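For what it’s worth, the retention step in hek’s script is a small change; a two-week window would look like this (the /tmp directory here is just a stand-in for the real backup folder):

```shell
# Two-week retention: the same find command, with -mtime +14 instead of +60.
destdir=/tmp/dm-demo                  # stand-in for /volume1/NAS/HA
mkdir -p "$destdir"
touch -d "30 days ago" "$destdir/datamine_old.tar.gz"   # simulate an old backup
touch "$destdir/datamine_new.tar.gz"                    # simulate a fresh backup
find "$destdir"/datamine_* -mtime +14 -exec rm {} \;
ls "$destdir"    # the old archive is gone, the fresh one remains
```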
I sincerely doubt that dataMine has a problem with 30-40 files! I have nearly 3000 files in my dataMine folder and have no problems. I suspect the biggest issue is with Vera restarts, and the inevitable closing of the USB without properly ejecting the device (which of course isn’t possible in Vera since plugins don’t know that the unit is restarting). However, if Vera restarts often, then the chances are higher that there will be problems.
In any case, the new file structure will mean there are fewer files kept in each directory.
I don’t mean to hijack hek’s topic, but just to note: if you think keeping your file count below 30 to 40 will help, I really don’t think it will. Your best bet is to use a backup system like hek has described so that you don’t lose data.
Chris,
Would it be possible to have a basic backup feature in DM… providing DM a <share> and a User/PW for it to copy files to a central location? This way anyone, even people without Syno’s, can have backups very easily.
[quote=“Aaron, post:6, topic:175444”]Chris,
Would it be possible to have a basic backup feature in DM… providing DM a <share> and a User/PW for it to copy files to a central location? This way anyone, even people without Syno’s, can have backups very easily.[/quote]
Hi Aaron,
I did think about this, and I’ll have another think about it. The problem I thought of is if you put a backup solution into dataMine, then I need to be careful that it doesn’t impact on other Lua code. If it’s run outside of Lua, then it won’t be an impact (or not much anyway). However, maybe I can spawn something as a background task.
I’ll have a serious think about this as I do agree that something along these lines is highly desirable…
[quote=“Chris, post:7, topic:175444”]I’ll have a serious think about this as I do agree that something along these lines is highly desirable…[/quote]
Chris,
Thanks for considering it… and now that I think about it, backup should be a separate app.
Not sure if the following ‘logic’ is possible with Vera but…
- Each backup job is a separate Device in Vera.
- Each device contains one or more source folders and one destination folder.
- Each device (backup job) is triggered using existing automation methods: native Vera automation, PLEG events, etc.
Hi Aaron,
I think this is making it too complex. You’re now talking about a separate app as a general backup tool, but I don’t know that there’s a lot of other data that needs backing up in Vera. Most Vera data is already backed up by the MCV backup system.
When I get a chance I’ll take a look at some sort of backup for dataMine, but I don’t want to turn it into a general backup tool - at least not right now… Sorry…
I couldn’t make this work until I added “-i ~/.ssh/keyfilename” to the scp line so that scp knows which keypair to use. Is there some other magic I’m missing that makes things work without that flag? A minor detail anyway; the script and instructions are great otherwise.
@hek has been in discussion with MCV about CIFS support and we’ve been testing this in dataMine over the past few weeks. It seems to work pretty well directly logging data to a NAS or Linux server directory through a share. We’ve been running it now for 2 weeks with no problems at all and it potentially avoids issues with poor USB support in Vera/OpenWRT.
In the coming days I’ll try and get a new version of dataMine released which has this support, although you’ll still need to manually install some files but we’ll post instructions.
Excited about the NAS option. I have a good-sized Synology NAS, but I’m not familiar with Linux or the SSH functions listed above that are required for the backup. If I can just point dataMine to a share on my NAS, and bypass both USB logging and the Synology backup script above, that would work great for me.
The NAS mount/logging has been working flawlessly for a month now! It seems to be a really good solution to all the data problems we’ve seen using USB.
Chris has been pretty busy at work lately, and it would be good if he could add re-mount functionality before release (in case the NAS goes offline for a long time and Vera unmounts it).
Yes, I’d like to get this added. While I was away we had a power cut, and there’s a race condition: if the NAS starts after the Vera, dataMine won’t mount the drive until Luup restarts. I’ll take a look at this, but otherwise CIFS seems to be a good solution (thanks hek).
Love the idea of using CIFS to save to the NAS instead of a USB stick. I installed the CIFS module, but when I try to mount the share, I get the message:
mount error 11 = Resource temporarily unavailable
Refer to the mount.cifs(8) manual page (e.g.man mount.cifs)
mount: mounting //192.168.1.2/veralogs on /nas failed: Resource temporarily unavailable
The share is available (I can access it from Windows), so that isn’t the problem. Does anyone have a suggestion for connecting?
I’ve been running this for a month or two now, and it’s worked fine… The latest version of dataMine has support for this.
Install CIFS as per the info earlier in this thread, and copy all the files from the /dataMine directory on your USB stick to your NAS. Set the variable SetMountOption to “-t cifs -o user=XXX,pass=YYY”, and SetMountPoint to the NAS share (e.g. “//192.168.2.10/share”).
dataMine should then mount the NAS on your next startup - in theory it should also unmount your USB - it worked for me, but you should double check this.
The current version won’t attempt to remount, so if you have a power cut and Vera starts before your NAS, the mount may fail. To resolve this you’d need to press the restart button in UI5. This will be addressed in the next version.
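For anyone debugging this by hand, the SetMountOption/SetMountPoint settings above correspond roughly to a manual mount like this. The server, share, and credentials are placeholders; it needs to be run as root, and a manual test mount should be unmounted again before dataMine takes over:

```
# Manual CIFS mount, equivalent to the SetMountOption/SetMountPoint settings.
# (Placeholder server, share, and credentials -- substitute your own.)
mkdir -p /mnt/nastest
mount -t cifs -o user=XXX,pass=YYY //192.168.2.10/share /mnt/nastest
ls /mnt/nastest      # verify the share contents
umount /mnt/nastest  # unmount when done testing
```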
[quote=“480vac, post:15, topic:175444”]mount error 11 = Resource temporarily unavailable
Refer to the mount.cifs(8) manual page (e.g.man mount.cifs)
mount: mounting //192.168.1.2/veralogs on /nas failed: Resource temporarily unavailable[/quote]
Sorry… really hard to debug remote. Did you solve it? Are you sure that your share really is exposed on /veralogs (without any prefix) and has the correct permissions for the user you are using?
Everything seems OK with the share (at least as far as I can tell from a Windows computer). Perhaps I’ll set up a Linux virtual machine to test against. I was reading that there are some bugs in the CIFS mount code that can cause this. Other Linux users suggest using smb mount instead; I don’t think this is supported on Vera though.
I’m not sure why my Vera doesn’t work when others’ do. I’ll keep trying and hope I figure it out. Otherwise, I’ll just stick with the USB stick.
Thanks so much for the CIFS instructions, I was getting so frustrated with my keys failing.
Are these instructions going to be posted more widely on the forum? Such an integration will help in other ways too.
I’ve just mounted a share on my NAS; however, it would not let me map to a newly created sub-folder within that share. Any ideas what might restrict it? Does the folder need certain privileges, and if so, which ones? (I’m still new to Linux; is that a chmod command?)
Also, two other thoughts/questions:
How do I remove a mounted CIFS drive if I set it up wrong?
Now that the data is stored outside Vera, are there any plans to create a standalone dataMine app?