In web application development, you are sometimes stuck with only one way to upload files to your webserver: FTP. If you are used to pushing changes to a git server and then pulling them on your webserver over an SSH connection, chances are you simply can't do that.
I had this problem recently, but I luckily found a very nice shell script, git-ftp, that lets me push changes to the webserver via FTP, recognizing all the files I updated (or added, or deleted) in my local git repository.
My development / deployment cycle now goes simply like this:
$ # the ordinary "git" stuff...:
$ git add files_to_add
$ git rm files_to_remove
$ git commit -a -m 'description of the changes'
$ git push
$ # now push the changes on the deployment server:
$ git-ftp push
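One detail worth noting: on the very first deployment, git-ftp has to upload everything and record which commit is live on the server; after that, push only transfers the difference. A minimal sketch (the server details come from the git config settings described below):

```shell
# First deployment only: upload all tracked files and remember
# the deployed commit on the server
git-ftp init

# Every later deployment: upload or delete only the files changed
# since the last recorded commit
git-ftp push
```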
The credentials used to access the deployment server are stored with the ordinary git config command. From the git-ftp man page:
$ git config git-ftp.user john
$ git config git-ftp.url ftp.example.com
$ git config git-ftp.password secr3t
$ git config git-ftp.syncroot path/dir
$ git config git-ftp.cacert caCertStore
$ git config git-ftp.deployedsha1file mySHA1File
$ git config git-ftp.insecure 1
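These settings are ordinary per-repository git configuration, so you can inspect them with `git config --get`. A quick sanity check, sketched in a throwaway repository (names and values here are just examples):

```shell
# Create a scratch repository and store two git-ftp settings in it
tmp=$(mktemp -d)
git init -q "$tmp"
git -C "$tmp" config git-ftp.user john
git -C "$tmp" config git-ftp.url ftp.example.com

# Read a value back; per-repository settings live in .git/config
git -C "$tmp" config --get git-ftp.user
```

Keep in mind that git-ftp.password stored this way ends up in plain text in .git/config, so make sure that file is not itself deployed or shared.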
You can also run git-ftp in dry-run mode, to see what would happen without changing anything on the server, and in verbose mode, if you are curious about what actually happens.
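For instance (flag names as I remember them from git-ftp's help output; check `git-ftp --help` on your version):

```shell
# List the files that would be uploaded or deleted,
# without touching the server
git-ftp push --dry-run

# Show each transfer as it happens
git-ftp push --verbose
```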
Update: for a different project I needed to use sftp, which is not supported by libcurl in a default Ubuntu installation. However, I followed the instructions found on zeroset and got it working with no problems.
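Once libcurl can speak sftp, pointing git-ftp at an sftp server is, as far as I can tell, just a matter of the URL scheme (host and path below are placeholders):

```shell
git config git-ftp.url sftp://ftp.example.com/path/on/server
git config git-ftp.user john
```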