If you have a big file that you want to send to hundreds of computers, you might reach for the copy or xcopy commands, but in certain scenarios they are not enough. For example, on a slow connection, if the copy fails partway through for whatever reason, a corrupted file with the expected size can still sit on the destination computer even though the copy failed. This gives you a false sense that the copy completed when in fact it did not. Here is another way (and there are many more, such as an MD5 hash check) to copy the file and verify its integrity.
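As an aside, the MD5-hash approach mentioned above can be sketched in a few lines of Python: hash the source file, copy it, then hash the destination and compare. This is only an illustration, not part of the batch-file method below; the function names and paths are hypothetical.

```python
import hashlib

def md5_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so large files do not exhaust memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_verified(src, dst):
    """Return True if the destination copy matches the source byte-for-byte."""
    return md5_of(src) == md5_of(dst)
```

A mismatch between the two hashes tells you the destination file is corrupted, regardless of whether its size happens to look right.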
- Create a zip file of the big file (even if you don’t otherwise need a zip) and split it into multiple files
- Copy the whole folder to the destination computer
- Run the following bat file directly on the destination computer (not from the source computer; otherwise it is the same as copying the whole file to the destination). If the file is extracted successfully, it generates a “success.txt” file; otherwise it generates an “error.txt” file. You can also write your own success/error notification based on your needs.
"C:\EXE_LOCATION\7za.exe" e -o"C:\MY_FOLDER" "C:\ZIP_FILE_LOCATION\Release4.zip.001"
if errorlevel 1 (
    rem Extraction failed; create an empty error.txt as a failure marker
    echo. >NUL 2>"C:\MY_FOLDER\error.txt"
) else (
    rem Write your own code once the file is successfully extracted
    echo. >NUL 2>"C:\MY_FOLDER\success.txt"
)
- The big file is copied in chunks, which also lets you control the network flood gate.
- Extracting a zip file internally requires all of the original bytes to be present. If the regular copy/xcopy left a damaged file behind, the unzip will fail, so extraction automatically gives you an integrity check.
You can get the commands for splitting the zip file and combining it from here.