Channel: 2BrightSparks

SyncBackFree (freeware) • Re: Error when trying to use FTP

My problem is: I use the "portable" version because I don't have administrative privileges on the machine I would like to run it on. But without administrative privileges I can't even register the DLL manually. Is there a workaround for this?

Statistics: Posted by bragotto — Thu Mar 05, 2020 8:34 am



SyncBackPro (commercial) • Missing a function "delete on exist"

Hi, look at this mixed Mirror/Backup scenario:

Let SOURCE0 be a source directory with many subdirectories and files, and
Mirror-mx its monthly backup.
Then:
Mirror-m0 is an exact copy of SOURCE0 at time m0
Mirror-m1 is an exact copy of SOURCE0 at time m1; Mirror-m1 replaces Mirror-m0
Mirror-m2 is an exact copy of SOURCE0 at time m2; Mirror-m2 replaces Mirror-m1
...

At the turn of the year, SOURCE is cleaned up and any scrap is deleted (unfortunately mostly a few files too many). This turns SOURCE0 into SOURCE1. To have a backup, Mirror-m12 is frozen and SOURCE1 is added as Mirror-Y0m12 to new SOURCE1.

Monthly mirroring will continue in the coming year:
Mirror-m0 now contains an exact copy of SOURCE1 at Y1m1, incl. Mirror-Y0m12
Mirror-m1 now contains an exact copy of SOURCE1 at Y1m2, incl. Mirror-Y0m12
...

Since Mirror-Y0m12 no longer changes, this does not burden the mirroring process, but it does increase the storage required for mirroring. The storage needed would decrease significantly if, at the turn of the year, all files in Mirror-Y0m12 that are still unchanged in SOURCE2 were deleted.

How can this "delete on exist" process be implemented with SyncBackPro?
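
For illustration, here is a rough sketch, outside SyncBackPro, of the pruning step I have in mind; the paths and the "unchanged" test (same relative path, size and modification time) are only assumptions:

    # Rough sketch only: prune files from the frozen Mirror-Y0m12 that are still
    # unchanged in the live source. Paths and the "unchanged" test are assumptions.
    import os

    SOURCE = r"D:\SOURCE"                # hypothetical current source tree
    FROZEN = r"D:\SOURCE\Mirror-Y0m12"   # hypothetical frozen year-end mirror

    for dirpath, _, names in os.walk(FROZEN):
        for name in names:
            frozen_file = os.path.join(dirpath, name)
            rel = os.path.relpath(frozen_file, FROZEN)
            live_file = os.path.join(SOURCE, rel)
            if os.path.isfile(live_file):
                same_size = os.path.getsize(live_file) == os.path.getsize(frozen_file)
                same_time = int(os.path.getmtime(live_file)) == int(os.path.getmtime(frozen_file))
                if same_size and same_time:
                    # Still unchanged in the source, so the frozen duplicate can go.
                    os.remove(frozen_file)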

Thanks, and good health to everyone around the globe
Keltas

Statistics: Posted by keltas — Fri Mar 06, 2020 1:39 pm


SyncBackPro (commercial) • Re: Missing a function "delete on exist"

... sorry, there may be some misleading wording (I don't know how to get my posting changed).

replace "To have a backup, Mirror-m12 is frozen and SOURCE1 is added as Mirror-Y0m12 to new SOURCE1"
with "To have a backup, Mirror-m12 is frozen and added as Mirror-Y0m12 to SOURCE to become a new SOURCE1"
Best
Keltas

Statistics: Posted by keltas — Fri Mar 06, 2020 1:56 pm


SyncBackFree (freeware) • SBF opens minimized

On Win 10, SBF opens minimized if it was last closed from the taskbar. It looks like the program failed to start, but it is actually just minimized to the taskbar immediately. Kind of strange; I don't recall any other program remembering that it was minimized when it was closed.

Statistics: Posted by mmullins_98 — Sat Mar 07, 2020 11:34 am


SyncBackFree (freeware) • keep getting this error with latest update

Failed to scan files: Internal Error: ScanTree exception: Out of memory (Destination)

Anyone know how to fix it?

Statistics: Posted by jbm007 — Sun Mar 08, 2020 4:50 pm


SyncBackPro (commercial) • Re: Windows Standby RAM + Verify that files are copied correctly

So it seems that during verification SyncBackPro just uses the file from Windows Standby RAM and doesn't read the file back from the destination drive, which defeats the purpose of the check?
Is this still true (especially of Version 9)?

Statistics: Posted by BMWBig6 — Sun Mar 08, 2020 8:56 pm


SyncBackPro (commercial) • Two speed questions, all help welcome!

I'm on a Win 10 box (Xeon 2246, 6 cores / 12 threads, 32 GB RAM).

1) MULTI-CORE SPEED. Is there a trick to make SyncBack pull higher priority?
I ran a test internal-to-internal folder backup. I used level 9 compression, individual zip files, and 256-bit encryption. It was obviously not using much of the available CPU: I am only at ~25% load, with over 20 GB of free memory, and this is an SSD RAID, so I know it isn't the disk that is holding things up. Some of the tasks I do involve quick backups to portable drives. Any tips to make it use more of the available CPU?

2) AMAZON S3 NETWORK SPEED TIPS. Backing up encrypted and compressed mixed files (docs, photos, etc.) to Amazon S3 (not an S3-compatible service; directly to S3) over a Comcast connection. Backups are going to happen in the evening while I'm away.

My "Performance" window says the following are slowing it down:
Compression - Compression (This doesn't seem to be doing much, unless I'm missing something, because even a level 9 internal test is about 10-20x faster than the demonstrated upload speeds.)
Cloud - Amazon S3 (I assume this just means "slowing down as compared to network disk....?)
Simple - File and Folder Selection (again, not sure what this is.)

I am using default threads. I am getting only a couple of M per second, though my upload speed is usually higher.

Any tips?

Statistics: Posted by hammarlundlaw — Sun Mar 08, 2020 10:13 pm


SyncBackPro (commercial) • Re: Two speed questions, all help welcome!

Also: I know I can let this run overnight and then stop it when I return. When I start it again, the already-uploaded files will be on the S3 server, so (other than the wasted resources of running a scan) it will resume. But is there a better way, for example a "pause and resume later" function?

Statistics: Posted by hammarlundlaw — Sun Mar 08, 2020 10:19 pm



SyncBackPro (commercial) • Re: S3 Glacier Deep Archive

Hi,

Note SyncBackPro V8 supports file backup to Glacier through Amazon S3 (not directly to Glacier). Once the files are copied to S3, Amazon moves them automatically from S3 to Glacier based on the lifecycle rules set in S3. Therefore, you can use SyncBackPro V8 to back up files to S3 Glacier Deep Archive. For additional details, please read the Amazon Glacier article.

However, it’s not possible to restore files from Glacier Deep Archive using V8. This feature will be supported in the next major version (V9). SyncBack V9 is still a work-in-progress and we have no ETA on when it will be available.

Thank you.
Can SyncBackPro back up to AWS S3 Glacier or Glacier Deep Archive now?

While we can use lifecycle rules to move files from S3 to Glacier or Glacier Deep Archive, it would be best if we could store data in S3 Glacier Deep Archive directly.
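
For reference, a minimal sketch of that lifecycle approach with boto3, outside SyncBackPro; the bucket name, prefix and rule ID are placeholders:

    # Minimal sketch (placeholder bucket/prefix/ID): transition everything under
    # "backups/" to the S3 Glacier Deep Archive storage class via a lifecycle rule.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-backup-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "to-deep-archive",
                    "Status": "Enabled",
                    "Filter": {"Prefix": "backups/"},
                    "Transitions": [{"Days": 0, "StorageClass": "DEEP_ARCHIVE"}],
                }
            ]
        },
    )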

The following is extracted from the AWS website:

"Q: How do I get started using S3 Glacier Deep Archive?

The easiest way to store data in S3 Glacier Deep Archive is to use the S3 API to upload data directly. Just specify “S3 Glacier Deep Archive” as the storage class. You can accomplish this using the AWS Management Console, S3 REST API, AWS SDKs, or AWS Command Line Interface. "
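
And a minimal sketch of that direct upload with boto3, again outside SyncBackPro; the file name, bucket and key are placeholders:

    # Minimal sketch (placeholder names): upload a file straight into the
    # S3 Glacier Deep Archive storage class, with no lifecycle rule involved.
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(
        "C:/Backups/archive-2020-03.zip",   # local file (placeholder)
        "my-backup-bucket",                 # bucket (placeholder)
        "backups/archive-2020-03.zip",      # object key (placeholder)
        ExtraArgs={"StorageClass": "DEEP_ARCHIVE"},
    )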

Statistics: Posted by tonylim — Mon Mar 09, 2020 3:18 am


SyncBackPro (commercial) • Re: Windows Standby RAM + Verify that files are copied correctly

Hi,

You can tick the option 'Stop Windows from caching a file's contents when it is copied' under: Modify profile > Expert > Copy/Delete > Advanced page, and then SyncBack will tell Windows not to cache the contents of the file in memory. But if the drive, network or anything else is caching, then it's beyond our control.
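
If you want to double-check a copy yourself, independently of SyncBack, a simple sketch like the one below re-reads both files and compares hashes; the paths are placeholders, and note that the operating system may still serve the destination read from its cache:

    # Simple sketch (placeholder paths): verify a copy by hashing source and
    # destination. The OS may still answer the destination read from its cache.
    import hashlib

    def sha256_of(path, chunk_size=1024 * 1024):
        digest = hashlib.sha256()
        with open(path, "rb") as handle:
            for chunk in iter(lambda: handle.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    src = r"C:\Data\report.docx"     # placeholder source file
    dst = r"E:\Backup\report.docx"   # placeholder destination file
    print("match" if sha256_of(src) == sha256_of(dst) else "MISMATCH")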

Thank you.

Statistics: Posted by Swapna — Mon Mar 09, 2020 7:04 am


SyncBackFree (freeware) • Re: keep getting this error with latest update

Unhandled exception at 0x004BB2F9 in SyncBackFree.exe: 0xC0000005: Access violation writing location 0x06B10FDC.

Because it's free, they don't have to fix it?

Somebody there needs to run their debugging software.

Statistics: Posted by jbm007 — Mon Mar 09, 2020 1:31 pm


SyncBackPro (commercial) • Amazon Drive

Hi,
Which version of SyncBackPro supports Amazon Drive?
I currently use SyncBackPro V9.3.4.0 and the function is disabled; I have to upload the photos to Amazon Photos myself.

Thanks :)
Roberto

Statistics: Posted by rcogni — Tue Mar 10, 2020 7:24 am


SyncBackFree (freeware) • Re: SyncBackFree V9.0.6.5 -- Clearing Notifications

Hello
I came onto the forum for this very reason. It drives me nuts.
There you are, telling SyncBack what to do, and everything gets needlessly duplicated in Windows notifications, where the list gets longer and longer. How do you stop this from happening?
Thanks

Statistics: Posted by SafeTex — Tue Mar 10, 2020 9:14 am


SyncBackPro (commercial) • FEATURE REQUEST: Folder-level compressed file creation

Right now, SB Pro only has a "per file" and "one huge single file" zip option.

"per file" is costly for some services which charge fees on a per-file basis. But "one huge single file" is a real pain when the files have only changed in a few folders, and you need to re-upload the entire thing.

it would be GREAT to have a middle ground. Please add a "per folder" / "per-subfolder" option to the .zip compression

This will allow folks to run incremental or fast backups of large volumes, while replacing only the folder(s) that have been changed since the last backup. Also, since all users are fast at working w/ folders, this makes it really easy to tweak syncback.
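
In the meantime, here is a rough sketch of the idea outside SyncBack, zipping each top-level subfolder into its own archive; the paths are placeholders:

    # Rough sketch (placeholder paths): create one zip per top-level subfolder,
    # so only the archives for changed folders ever need to be re-uploaded.
    import os
    import zipfile

    SOURCE = r"C:\Data"          # placeholder source tree
    DEST = r"E:\Backup\Zips"     # placeholder destination for the per-folder zips

    os.makedirs(DEST, exist_ok=True)
    for entry in os.scandir(SOURCE):
        if not entry.is_dir():
            continue
        zip_path = os.path.join(DEST, entry.name + ".zip")
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as archive:
            for dirpath, _, names in os.walk(entry.path):
                for name in names:
                    full = os.path.join(dirpath, name)
                    archive.write(full, os.path.relpath(full, SOURCE))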

Statistics: Posted by hammarlundlaw — Wed Mar 11, 2020 7:50 pm


SyncBackPro (commercial) • Backup only if larger, or force never-copy-to-left

I'm looking to automate (Unattended) this functionality; is anyone aware of what combination of settings I can use to meet this goal?

Files only on Left are copied to Right
If a file is only on Right then it is ignored
If the same file has been changed on both Left and Right, then 'copy to Right if Left is larger'
(i.e. Do not copy from Right to Left under any circumstances; the "Left" side should not be changed.)

I am able to do this with a manual Run and manually setting all Copy-to-Left to Skip.
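
For what it's worth, this is the decision logic I'm after, written out as a rough sketch outside SyncBack; the paths are placeholders, and "changed on both" is approximated by differing timestamps:

    # Rough sketch (placeholder paths): Left-only files are copied to Right,
    # Right-only files are ignored, and when both differ the Left file is copied
    # only if it is larger. Nothing is ever copied from Right to Left.
    import os
    import shutil

    LEFT = r"C:\Left"    # placeholder
    RIGHT = r"D:\Right"  # placeholder

    def relative_files(root):
        files = {}
        for dirpath, _, names in os.walk(root):
            for name in names:
                full = os.path.join(dirpath, name)
                files[os.path.relpath(full, root)] = full
        return files

    left = relative_files(LEFT)
    right = relative_files(RIGHT)

    for rel, lpath in left.items():
        rpath = os.path.join(RIGHT, rel)
        if rel not in right:
            # Only on Left: copy to Right.
            os.makedirs(os.path.dirname(rpath), exist_ok=True)
            shutil.copy2(lpath, rpath)
        elif os.path.getmtime(lpath) != os.path.getmtime(rpath):
            # Differs on both sides: copy to Right only if Left is larger.
            if os.path.getsize(lpath) > os.path.getsize(rpath):
                shutil.copy2(lpath, rpath)
    # Files only on Right are never visited, so they are left alone.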

TIA.

Statistics: Posted by edc — Wed Mar 11, 2020 6:22 pm



SyncBackPro (commercial) • Re: FEATURE REQUEST: Folder-level compressed file creation

+1

Also, encrypting folder names would be good.

Statistics: Posted by cowelln — Wed Mar 11, 2020 9:58 pm


SyncBackPro (commercial) • Profile Start Time of Child profile log

I have a Group Profile with, say, 5 Child/Sub profiles. When I look at the log of a Child/Sub profile, the profile start time is always the Group profile start time. As a result, the elapsed time of, say, the 5th Child/Sub profile appears to be very long, since it is calculated from when the Group profile (or the 1st Child/Sub profile) started.

Since a log file is generated for the Group Profile and for each Child/Sub profile, is there any way I can set things up so that the start time in each Child/Sub profile's log file is the actual start time of that Child/Sub profile?

Statistics: Posted by tonylim — Fri Mar 13, 2020 12:50 am


SyncBackPro (commercial) • Scheduling every xx minutes

I am working on a Win 10 computer, as the destination, connecting to a Win 7 computer which currently hosts the folder I wish to copy every 30 minutes. I am using SyncBackPro V7.

How can I set up a profile in the scheduler to run every 30 minutes?
It is set up to run this profile every 0 days, 0 hours, 30 minutes, 0 seconds,
for a duration of 30 days, and then I click OK.

The schedule then appears with:
Schedule
Every 1 days, repeating every 15 minutes for a duration of 30 days,

If I click OK on it, the profile will not run until the following day, and then, I suppose, once per day for the next month.

How can I set it up to run every 30 minutes, beginning immediately?

Thank you for your help.

Peter

Statistics: Posted by PeterDr — Fri Mar 13, 2020 1:32 am


SyncBackPro (commercial) • Re: Profile Start Time of Child profile log

Hi,

Sorry, that is not possible. By default, when you run a group, SyncBack starts all the child profiles of the group and pauses them. The profiles in the group are then executed in sequential order (from the first profile to the last). Thus, the start time of the child profiles is the same as the group start time.

Thank you.

Statistics: Posted by Swapna — Fri Mar 13, 2020 5:28 am


SyncBackPro (commercial) • Re: Scheduling every xx minutes

Hi,

Configure the profile to Recur every [1] day and Repeat setting as:

Run this Profile every : [0] days, [0] hour, [30] minutes, [0] seconds

for a duration of : [0] days, [23] hours, [30] minutes, [0] seconds

Do not enable the 'Indefinitely' option.

Thank you

Statistics: Posted by Swapna — Fri Mar 13, 2020 5:35 am

