Python: Need an automatic way to move files between computers

AI Thread Summary
To automatically move files between three PCs, several methods can be employed. Using cloud services like Google Drive is not feasible due to offline requirements, so alternatives include setting up a shared drive on a master PC, utilizing Windows commands like NET SHARE/NET USE, or employing Robocopy for file monitoring and copying. For Linux systems, SCP can be used, although it lacks the ability to check file versions automatically. A more robust solution involves creating a script that compares timestamps to ensure only the latest files are transferred, or using a version control system like Git for synchronization across different operating systems. The discussion emphasizes the need for a reliable method to maintain file consistency across all machines.
nashed
I have a system of 3 PCs all connected to the same router. I need some way to automatically move files from two of the PCs to the third. How do I even begin to approach this?
 
What do you mean by automatic?
 
Not anything I'd be familiar with at all, but would OneDrive, DropBox, or Google Drive be useful for this?
 
Google Drive could work and would automatically keep all three machines in sync.

A more traditional approach would be to set up a shared drive on one machine, allowing the other two to monitor the shared drive with a simple program that copies files over on demand.
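As a rough sketch of that simple copy program (the folder names and share path are just placeholders), each capture PC could push anything that hasn't appeared on the share yet:

Python:
import shutil
from pathlib import Path

# Hypothetical paths: the local output folder and the shared drive on the central PC
LOCAL_DIR = Path(r"C:\kinect_output")
SHARED_DIR = Path(r"\\CENTRAL-PC\incoming")

# Copy every local file that is not already present on the share
for src in LOCAL_DIR.iterdir():
    if src.is_file() and not (SHARED_DIR / src.name).exists():
        shutil.copy2(src, SHARED_DIR / src.name)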
 
If you are on Windows:

The Robocopy.exe that comes with Windows can monitor for and copy files. Monitoring can be on a time basis or on the number of changed files.

Another approach is to write a .BAT (batch) file that invokes the Xcopy utility with the appropriate options, then use the Task Scheduler to invoke the batch file periodically.
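For what it's worth, the Robocopy monitoring can also be kicked off from a small script; a rough sketch (the paths are placeholders, and the switches are worth double-checking against robocopy /?):

Python:
import subprocess

SRC = r"C:\kinect_output"        # hypothetical source folder
DST = r"\\CENTRAL-PC\incoming"   # hypothetical destination share

# /E copies subdirectories; /MOT:5 makes Robocopy monitor the source and
# rerun the copy every 5 minutes whenever changes are detected.
# Robocopy returns non-zero exit codes even on success, so check=True is not used.
subprocess.run(["robocopy", SRC, DST, "/E", "/MOT:5"])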
 
Thanks for the replies, everyone. Here's the deal, though: I'm working on a system that needs to sit in a remote location and will contain files sensitive enough that we'd rather keep it offline, meaning that Google Drive and OneDrive aren't really an option.

A further explanation of the system: I'll have some number X of PCs (with Windows 10), probably somewhere around 6 or 7. Each of these PCs operates a Kinect and will receive point clouds from it. I want to send the point clouds to a central PC, which will do the processing on them, and I need the operation to work automatically.
 
Why don't you just share a disk on the "master" PC? See the NET SHARE/NET USE commands. Then have all the PCs write to it. This will work so long as you don't also have a need to duplicate file names.
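A rough illustration of that idea (the machine name, share name, and .ply extension are made up): map the share once with NET USE, then have each capture PC prefix its files with its own hostname so names never collide:

Python:
import shutil
import socket
import subprocess
from pathlib import Path

# On the master PC the share would be created once, e.g. with: NET SHARE incoming=C:\incoming
# On each capture PC, map that share to a drive letter:
subprocess.run(["net", "use", "Z:", r"\\MASTER-PC\incoming"], check=True)

# Prefix each file with this machine's hostname to avoid duplicate file names
hostname = socket.gethostname()
for src in Path(r"C:\kinect_output").glob("*.ply"):
    shutil.copy2(src, Path("Z:/") / f"{hostname}_{src.name}")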
 
Security/encryption is presumably not an issue if you're isolated from the net, so ftp or similar?
 
  • #10
Vanadium 50 said:
Why don't you just share a disk on the "master" PC? See the NET SHARE/NET USE commands. Then have all the PCs write to it. This will work so long as you don't also have a need to duplicate file names.
Folks have forgotten about the good old days when you could actually share a drive with another computer.
 
  • #11
If this is just for backup purposes, perhaps you can simply share the drive, map it to a drive letter, and use Windows 10 backup. I've not actually tried this; I use a NAS drive.
 
  • #12
If they are all linux (and even if they are not), we can use the ssh suite of utilities that are included by default.
More specifically 'scp' (secure copy) will copy files or directories securely from one computer to the other.
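For example (the host name and directories are placeholders), a capture machine could push each new file with scp, either from a shell script or from Python:

Python:
import subprocess
from pathlib import Path

CENTRAL = "user@central-pc:/data/incoming/"   # hypothetical destination reachable over ssh

for f in Path("/home/kinect/output").glob("*.ply"):
    # scp copies over ssh; key-based authentication avoids password prompts
    subprocess.run(["scp", str(f), CENTRAL], check=True)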
 
  • #13
I like Serena said:
If they are all linux (and even if they are not), we can use the ssh suite of utilities that are included by default.
More specifically 'scp' (secure copy) will copy files or directories securely from one computer to the other.

I think what the OP is looking for is a means to keep the files at the same level on each machine, meaning if a file is changed on one, it gets replicated onto the other machines. Using scp will copy files to or from another machine, but it can't check whether a file is newer or older than the target, and so may copy an older version over a newer one.

As others have mentioned, you should set up a master copy on one machine and have each machine run a script that does timestamp compares and decides whether to copy the file to the local machine or not.
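A bare-bones version of that timestamp compare in Python might look like this (the master share and local folder are placeholders); each machine would schedule it to run periodically:

Python:
import shutil
from pathlib import Path

MASTER = Path(r"\\MASTER-PC\master_copy")   # hypothetical master share
LOCAL = Path(r"C:\local_copy")

for src in MASTER.rglob("*"):
    if not src.is_file():
        continue
    dst = LOCAL / src.relative_to(MASTER)
    dst.parent.mkdir(parents=True, exist_ok=True)
    # Copy only if the file is missing locally or the master copy is newer
    if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
        shutil.copy2(src, dst)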
 
  • #14
jedishrfu said:
I think what the OP is looking for is a means to keep the files at the same level on each machine, meaning if a file is changed on one, it gets replicated onto the other machines. Using scp will copy files to or from another machine, but it can't check whether a file is newer or older than the target, and so may copy an older version over a newer one.

As others have mentioned, you should set up a master copy on one machine and have each machine run a script that does timestamp compares and decides whether to copy the file to the local machine or not.
Nope. OP asked specifically to automatically move files from 2 PCs to a third.
Anyway, I see that OP wrote that he has Windows, which means that scp still works, but is somewhat awkward to configure.
 
  • #15
Traditional data feeds use a scheduled job with FTP servers and clients. Files usually move to a holding area on the receiving end overnight. A subsequent scheduled job then grabs the received file and does something with it. If all the machines are on an intranet together, this could work.
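A minimal sketch of that kind of feed using Python's standard ftplib (the server address, credentials, and holding directory are made up):

Python:
import ftplib
from pathlib import Path

HOST, USER, PASSWORD = "192.168.1.10", "capture", "secret"   # hypothetical FTP server details

with ftplib.FTP(HOST, USER, PASSWORD) as ftp:
    ftp.cwd("holding")                                 # holding area on the receiving end
    for f in Path(r"C:\kinect_output").glob("*.ply"):
        with open(f, "rb") as fh:
            ftp.storbinary(f"STOR {f.name}", fh)       # upload; a later job processes it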
 
  • #16
I would connect all three machines to each other inside a VPN, then use a Git repository that they all draw from. I'd put a cron job in place that would run a Git update every minute or so. Git means it would not matter what type of machine any of the hosts is (you can have two Windows machines and a Red Hat one, for example). Git would also allow me to do things like tags, branches, and version history.
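As a sketch of that per-minute job (the repository path and commit message are arbitrary), the cron entry could just run a small script that commits local changes and pulls everyone else's:

Python:
import subprocess

REPO = "/data/pointclouds"   # hypothetical working copy present on every machine

def git(*args, check=True):
    return subprocess.run(["git", "-C", REPO, *args], check=check)

git("add", "-A")
git("commit", "-m", "auto sync", check=False)   # non-zero exit (no-op) if nothing changed
git("pull", "--rebase")
git("push")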

But first, is this literally only for backup? There are better solutions for that specific problem (like RAID).
 