I am a heavy Git user, I use it for mostly... almost everything! I have a lot of repos on my machine, connected to various online accounts (GitHub, GitLab, but also others). Most of these are kept in some dedicated folder (say /home/myname/dev, for example).
Now, I have a new computer. How do I import all these repos at once? Of course, I don't want to have to clone them one by one manually! That would be fine for 1, 2 or 3 repos, but I've got dozens.
So I just wrote these 2 scripts: the first generates a list of remotes in a text file (on the "old" machine), and the second automates cloning from that file on the new computer.
(Finding something similar on SO or elsewhere seems surprisingly hard, so I rewrote it myself; I'm probably not the first one...)
First, on the old computer, drop the following script in the folder holding the repos, and run it:
```shell
#!/bin/bash
# git_generate_url_list.sh
# Generate a list of the git remotes found in the current folder
# (also logs their sizes)
# S. Kramm - 2020-01-04

echo "# repos list" > url_list.txt
echo "# repos size" > repos_size.txt
for i in */
do
    echo "Processing $i"
    du -hs "$i" >> repos_size.txt
    cd "$i"
    git remote get-url --all origin >> ../url_list.txt
    cd ..
done
```
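For reference, the resulting url_list.txt holds the comment header followed by one remote URL per line (the URLs below are made up for illustration):

```
# repos list
https://github.com/someuser/project-a.git
git@gitlab.com:someuser/project-b.git
```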
This also records the size of each repo, which can be useful to detect that something went wrong...
Then, copy that URL list file to the new computer, drop it in /home/myname/dev (or whatever location) along with this second script, and run it:
```shell
#!/bin/bash
# git_clone_from_url_list.sh
# Clone, into the current folder, every URL listed in the file given as argument
# S. Kramm - 2020-01-04

if [ -z "$1" ]
then
    echo "Missing filename!"
    exit 1
fi
echo "git cloning from file $1"
while read -r a
do
    if [[ ${a:0:1} != "#" ]]   # skip comment lines
    then
        echo "importing repo from $a"
        git clone "$a"
    fi
done < "$1"
```
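One caveat: if the script is interrupted and you re-run it, `git clone` will fail on every repo that was already imported. A sketch of a variant of the loop that skips existing directories (it assumes the same file format, and derives the target directory name from the URL the same way `git clone` does, i.e. the last path component minus a trailing ".git"):

```shell
#!/bin/bash
# Sketch: clone loop that skips repos already present locally.
# Assumed file format: comment lines start with '#', one URL per line.

clone_missing() {
    while read -r url
    do
        [[ ${url:0:1} == "#" || -z $url ]] && continue   # skip comments and blank lines
        dir=$(basename "$url" .git)                      # directory name git clone would use
        if [ -d "$dir" ]
        then
            echo "skipping $dir (already exists)"
        else
            echo "importing repo from $url"
            git clone "$url"
        fi
    done < "$1"
}
```

Call it as `clone_missing url_list.txt`; already-cloned repos are reported and left untouched.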
Of course, if you use HTTPS, you will need to provide passwords for the private repos, but only once per online service.
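If even that is too much typing, Git's standard credential helpers can remember the passwords for you; you could enable one before running the clone script, for instance:

```shell
# Cache entered credentials in memory for one hour (timeout is in seconds)
git config --global credential.helper 'cache --timeout=3600'

# Or, on a machine only you use, store them permanently (plain text on disk!):
# git config --global credential.helper store
```

The timeout value here is just an example; pick whatever suits your session length.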
Edit: also check out the other post about this topic.