Hi, these are the notes I took while watching the “Small Files And Big Bounties, Exploiting Sensitive Files” talk given by Sebastian Neef and Tim Philipp Schäfers at LevelUp 0x02 (2018).
About
This talk is about how to extract information from sensitive files like .DS_Store files and .git directories.
History of sensitive files
/robots.txt
- A file used to control how search engines index the site
Disallow: /path/to/sensitive/hidden.file
Problem
- Does not block access to the file itself
- Tells hackers where you don’t want them to look
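That second problem is easy to demonstrate: a few lines of Python (my own sketch, not from the talk) turn a robots.txt body into a list of paths worth probing manually.

```python
def disallowed_paths(robots_txt):
    """Extract Disallow entries from a robots.txt body.

    These are exactly the paths the site owner flagged as off-limits,
    so they make a good starting wordlist for manual probing.
    """
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()      # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                              # an empty Disallow allows everything
                paths.append(path)
    return paths

print(disallowed_paths("User-agent: *\nDisallow: /path/to/sensitive/hidden.file\n"))
# → ['/path/to/sensitive/hidden.file']
```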
WS_FTP Professional
- FTP client, released in 1993, > 40 million users
- Some versions upload a file named ‘ws_ftp.ini’
- Config file; contains (weakly) encoded passwords, hosts & directories
A closer look at .git directories
- Git is a Version Control System (VCS) developed by Linus Torvalds
- Used to manage source code
- Commands: git init, git add, git commit, git push, git pull, …
- Data is stored in the .git directory
- Everything in Git is based on objects (see Git Internals - Git Objects)
Problem
- If the .git directory ends up on a Web server
- Happens if the deployment process is
cd /var/www/html && git pull
- Then the /.git/ folder might be accessible!
Why is it bad?
- If directory listing is enabled
- An attacker can download the whole repository with
wget
- Demo
wget --mirror --include-directories=/.git http://domain.tld/.git/
git status        # Reports the files as deleted, because only .git was downloaded
git checkout -- . # Restore the files from the downloaded .git directory
git log           # See what other commits are there
- If directory listing is disabled
- Getting the whole directory is hard
- But two files are always accessible (if there’s a master branch):
- .git/HEAD
- .git/refs/heads/master
- They can be used to obtain one object hash
- With it you can download the object file & get new object hashes
- Repeat until nothing new is found!
- You end up with parts of the repository, or even the full repository, on your hard disk
- Tool to automate this process: GitTools
- Demo
./gitdumper.sh http://demo.local/.git/ /output-directory/
git status        # Reports the files as deleted, because only .git was downloaded
git checkout -- . # Restore the files from the downloaded .git directory
git log           # See what other commits are there
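The hash-walking loop described above can be sketched in Python. Everything here is my own illustration (GitTools does the real work), based on two facts about Git's on-disk format: loose objects are zlib-compressed, commit objects reference tree/parent hashes as 40-char hex text, and tree objects store 20 raw SHA-1 bytes after each NUL-terminated entry name.

```python
import re
import zlib

def parse_loose_object(data):
    """Decompress one loose Git object and return (type, referenced hashes).

    Each new hash found here becomes the next download target:
    .git/objects/<hash[:2]>/<hash[2:]> (first two hex chars are a directory).
    """
    raw = zlib.decompress(data)
    header, _, body = raw.partition(b"\x00")   # e.g. b"commit 231" + NUL + body
    kind = header.split(b" ")[0].decode()
    hashes = set()
    if kind == "commit":
        # "tree <sha>" and "parent <sha>" lines hold 40-char hex hashes
        for h in re.findall(rb"(?:tree|parent) ([0-9a-f]{40})", body):
            hashes.add(h.decode())
    elif kind == "tree":
        # entries are b"<mode> <name>\x00" followed by 20 raw SHA-1 bytes
        i = 0
        while i < len(body):
            nul = body.index(b"\x00", i)
            hashes.add(body[nul + 1:nul + 21].hex())
            i = nul + 21
    return kind, hashes
```

Starting from the hash in .git/refs/heads/master you would fetch the corresponding object file, parse it, queue any new hashes, and repeat until the queue is empty — exactly the "repeat until nothing new is found" step above.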
Consequences
- Source code disclosure
- Get the source & find other vulns
- Find committed credentials & escalate privileges
- ! Only look further if it is allowed by the BBP’s rules !
- In some cases .git/config contains HTTP-BasicAuth credentials
- Instant access to company’s repositories (GitLab / GitHub …)
- Access to the CI (e.g. GitLabCI): Build scripts & auto-deployment may lead to server pwnage
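The .git/config case is straightforward to check: remote URLs sometimes embed user:password pairs. A minimal sketch (the config contents, host names, and credentials below are made up for illustration):

```python
import re
from urllib.parse import urlsplit

def find_credentials(git_config_text):
    """Return (user, password, host) triples from remote URLs in a .git/config dump."""
    creds = []
    for m in re.finditer(r"url\s*=\s*(\S+)", git_config_text):
        parts = urlsplit(m.group(1))
        if parts.username:  # credentials embedded in the URL
            creds.append((parts.username, parts.password, parts.hostname))
    return creds

# Hypothetical dump of an exposed /.git/config
config = '[remote "origin"]\n\turl = https://alice:s3cret@gitlab.example.com/app.git\n'
print(find_credentials(config))  # → [('alice', 's3cret', 'gitlab.example.com')]
```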
- ~10k out of Alexa’s Top 1M are affected! (Don’t publicly expose .git or how we downloaded your website’s sourcecode - An analysis of Alexa’s 1M)
- Examples of similar bugs found by bug bounty hunters:
A closer look at .DS_Store files
- Apple’s proprietary Desktop Services Store format on macOS => not easy to read
- Holds meta information (e.g. icons, file name, attributes) about files in a directory
- Hidden and automatically created when entering a directory using ‘Finder’
- Blog post on the .DS_Store file format
Problem 1
- This filename starts with a dot, so the Finder won’t show it by default
- This is bad if you share a ZIP archive or copy files to a USB stick: the recipient can learn which other files were stored in the directory, even ones you didn’t copy
Problem 2
- If you use a Mac for development & you use a tool like scp/rsync/ftp to transfer files from your local development environment to a server
- For example if the deployment process is
scp/rsync/ftp ./code/ server:/var/www/html/
- And if you don’t exclude .DS_Store files from the file transfer
- Then …
- All files, including .DS_Store, are transferred & exposed
- The .DS_Store file in each directory lists the files present in that directory (this is also why you get a conflict notification when you try to transfer a file that already exists on the server)
- => You can tell that a directory exists on the server without bruteforcing & even if directory listing is disabled
- You can also iterate the process (recursively fetch the .DS_Store file of each subdirectory) to obtain the whole remote file structure
- Tools for automation:
- Parsing: Python .DS_Store parser
- Recursively enumerating/checking referenced files: .DS_Store Scanner
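The recursion described above fits in a few lines. This sketch (mine, not the talk's tool) abstracts away the two hard parts: `fetch(url)` is assumed to return the file bytes or None on 404, and `parse_names(data)` is assumed to extract file names from .DS_Store bytes (e.g. with the Python parser mentioned above) — both are placeholders here.

```python
from collections import deque
from urllib.parse import urljoin

def enumerate_tree(base_url, fetch, parse_names):
    """Walk .DS_Store files recursively to map a remote directory tree.

    fetch(url) -> bytes or None on 404; parse_names(data) -> iterable of
    file names. Both are caller-supplied placeholders in this sketch.
    """
    found = set()
    queue = deque([base_url])
    while queue:
        url = queue.popleft()
        data = fetch(urljoin(url, ".DS_Store"))
        if data is None:
            continue                      # no .DS_Store here (or not a directory)
        for name in parse_names(data):
            path = urljoin(url, name)
            if path not in found:
                found.add(path)
                queue.append(path + "/")  # the name may be a subdirectory
    return found
```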
Demo
$ file samples/.DS_Store.ctf
samples/.DS_Store.ctf: Apple Desktop Services Store
$ hexdump -C samples/.DS_Store.ctf
# Output: A lot of binary data, you can't read much
$ python main.py samples/.DS_Store.ctf
# Output: File names recovered
Consequences
- Directory listing ‘bypass’
- Disclosure of (probably) accessible files on the server (without using bruteforce or having directory listing enabled)
- Backup files
- Database files
- Temporary files
- Key files (e.g. private keys from SSL enabled Web servers)
- Config files
- Files: .bak, .gz, .db, .eml, .old, .inc, .config, .sql, .pem, …
- ~10k from Alexa’s Top 1M are affected (Scanning the Alexa Top 1M for .DS_Store files)
- Examples of similar bugs found by bug bounty hunters:
- Information Disclosure through .DS_Store in xxx
- $250 reward for a similar issue found on a private program by the presenters (report is undisclosed), using fdb
$ curl 'http://api.../.DS_Store' -o .DS_Store
$ ./fdb.py --type ds --filename .DS_Store --base_url http://api.../
Other files
- There are a lot more files with sensitive content
- Be motivated to explore them & share your knowledge
- Examples:
- .svn/
- .idea/
- .swp
- .htpasswd
- .coredump
- winscp.ini
- filezilla.xml
- /server-status
Resources
- Tools & resources
- Blog posts & talks
- Tools
See you next time!