Is it possible to make SCP ignore symbolic links during copy?


Scp Problem Overview


I need to reinstall one of our servers, and as a precaution I want to move /home, /etc, /opt, and /Services to a backup server.

However, I have a problem: because of the many symbolic links, a lot of files get copied multiple times.

Is it possible to make scp ignore symbolic links (or, better, copy each link as a link rather than as the directory or file it points to)? If not, is there another way to do it?

Scp Solutions


Solution 1 - Scp

I knew that it was possible; I had just picked the wrong tool. I did it with rsync:

rsync --progress -avhe ssh /usr/local/  XXX.XXX.XXX.XXX:/BackUp/usr/local/
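The -a (archive) flag implies -l, so rsync recreates symbolic links as links instead of following them; -v is verbose, -h prints human-readable sizes, and -e ssh runs the transfer over SSH. If you would rather skip symlinks entirely, rsync's --no-OPTION prefix can switch the implied -l back off, roughly like this (same placeholder host and paths as above):

rsync --progress -avh --no-links -e ssh /usr/local/  XXX.XXX.XXX.XXX:/BackUp/usr/local/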

Solution 2 - Scp

The rsync method did not work for me; however, I found an alternative that did on this website (www.docstore.mik.ua/orelly).

Specifically, section 7.5.3 of O'Reilly's "SSH, The Secure Shell: The Definitive Guide":

> 7.5.3. Recursive Copy of Directories
>
> ...
>
> Although scp can copy directories, it isn't necessarily the best method. If your directory contains hard links or soft links, they won't be duplicated. Links are copied as plain files (the link targets), and worse, circular directory links cause scp1 to loop indefinitely. (scp2 detects symbolic links and copies their targets instead.) Other types of special files, such as named pipes, also aren't copied correctly. A better solution is to use tar, which handles special files correctly, and send it to the remote machine to be untarred, via SSH:

$ tar cf - /usr/local/bin | ssh server.example.com tar xf -
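tar preserves symbolic links and other special files by default, which is exactly what scp gets wrong. As written, the archive unpacks relative to the remote user's home directory; to unpack somewhere else, pass -C to the receiving tar (the /BackUp path here is just a hypothetical target):

$ tar cf - /usr/local/bin | ssh server.example.com 'tar xf - -C /BackUp'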

Solution 3 - Scp

Using tar over ssh as both sender and receiver does the trick as well:

cd "$DEST_DIR"
ssh user@remote-host "cd $REMOTE_SRC_DIR && tar cf - ./" | tar xvf -
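The double quotes matter: they let the local shell expand $REMOTE_SRC_DIR before the command is sent, whereas with single quotes the (probably unset) variable would be expanded on the remote side, and tar would archive the wrong directory. On slow links, you can also compress the stream in flight; a sketch with the same placeholders:

ssh user@remote-host "cd $REMOTE_SRC_DIR && tar czf - ./" | tar xzvf -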

Solution 4 - Scp

One solution is to use a shell pipe. I had a situation where some software generated *.gz files along with symbolic links pointing to the same *.gz files under slightly shorter names. If I simply used scp, the symbolic links were copied as regular files, resulting in duplicates. I know rsync can ignore symbolic links, but my gz files were not compressed with rsync-friendly options, and rsync was very slow copying them. So I simply used the following command to copy over the files:

find . -type f -exec scp {} target_host:/directory/name/data \;

The -type f test matches only regular files and ignores symbolic links. You need to run this command on the source host. Hope this may help some user in my situation. Let me know if I missed anything.
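Be aware that -exec scp {} ... \; opens a fresh SSH connection for every file, which gets slow with many small files. A single-connection sketch of the same idea, assuming GNU tar and the same hypothetical host and path as above:

find . -type f -print0 | tar cf - --null -T - | ssh target_host 'tar xf - -C /directory/name/data'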

Solution 5 - Scp

A one-liner, executed on the client, that copies a folder from the server using tar over ssh:

ssh user@<Server IP/link> 'mkdir -p <remote source directory>; cd <remote source directory>; tar cf - ./' | tar xf - -C <local destination directory>

Note: the mkdir -p is a must. If the remote source directory does not exist, the cd fails and tar simply archives the entire home directory of the remote server, extracting it on the client.
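For illustration, with hypothetical values filled in (the IP address and both directories below are placeholders):

ssh user@192.0.2.10 'mkdir -p /var/data; cd /var/data; tar cf - ./' | tar xf - -C /backup/data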

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

| Content Type | Original Author | Original Content on Stackoverflow |
|---|---|---|
| Question | Kris_R | View Question on Stackoverflow |
| Solution 1 - Scp | Kris_R | View Answer on Stackoverflow |
| Solution 2 - Scp | Luke | View Answer on Stackoverflow |
| Solution 3 - Scp | Vojtech Vitek | View Answer on Stackoverflow |
| Solution 4 - Scp | Kemin Zhou | View Answer on Stackoverflow |
| Solution 5 - Scp | Vinay Tiwary | View Answer on Stackoverflow |