Conference notes: Small Files And Big Bounties, Exploiting Sensitive Files (LevelUp 0x02 / 2018)

Posted in Conference notes on July 4, 2018

Hi, these are the notes I took while watching the “Small Files And Big Bounties, Exploiting Sensitive Files” talk given by Sebastian Neef and Tim Philipp Schäfers at LevelUp 0x02 (2018).

About

This talk is about how to extract information from sensitive files like .DS_Store files and .git directories.

History of sensitive files

/robots.txt

  • File that tells search engine crawlers which paths they should not index
  • Disallow: /path/to/sensitive/hidden.file

Problem

  • Does not block access to the file itself
  • Tells hackers where you don’t want them to look
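The second problem is easy to see in practice: since robots.txt is world-readable, its Disallow lines are a ready-made list of interesting paths. A minimal sketch against a local sample file (on a real target you would first fetch it, e.g. with curl -s http://domain.tld/robots.txt; the entries below are made up):

```shell
# Sample robots.txt standing in for a real target's file (entries are made up):
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
Disallow: /backup/db.sql
EOF

# Extract the "hidden" paths the site owner asked crawlers not to index:
awk -F': ' '/^Disallow:/ {print $2}' robots.txt
```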

WS_FTP Professional

  • FTP client, released in 1993, > 40 million users

Known vulnerability

  • Some versions upload a file named ‘ws_ftp.ini’
    • config file, contains (weakly) encoded password, hosts & directories
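To illustrate what leaks, here is a toy ws_ftp.ini with the fields an attacker would grep for. The layout is approximated from public write-ups and every value is invented; real files store PWD in a weak reversible encoding rather than plaintext:

```shell
# Toy ws_ftp.ini (layout approximated, values invented):
cat > ws_ftp.ini <<'EOF'
[example site]
HOST=ftp.example.com
UID=deploy
PWD=V1AB12CD34EF
EOF

# The fields an attacker cares about: host, username and (encoded) password:
grep -E '^(HOST|UID|PWD)=' ws_ftp.ini
```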

A closer look at .git directories

  • Git is a Version Control System (VCS) developed by Linus Torvalds
  • Used to manage source code
  • Commands: git init / add / commit / push / pull
  • Data is stored in the .git directory
  • Everything in Git is based on objects (see “Git Internals - Git Objects”)

Problem

  • If the .git directory ends up on a Web server
  • Happens if the deployment process is cd /var/www/html && git pull
  • Then the /.git/ folder might be accessible!
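If the .git directory cannot be kept out of the web root, a common mitigation is to deny it at the web server. A minimal sketch for nginx (adapt to your own server block; other servers have equivalent rules):

```nginx
# Refuse any request whose path contains a .git component; answering 404
# instead of 403 avoids confirming that the directory exists.
location ~ /\.git {
    return 404;
}
```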

Why is it bad?

  • If directory listing is enabled
    • An attacker can download the whole repository with wget
    • Demo
wget --mirror --include-directories=/.git http://domain.tld/.git/
git status 			# Shows the files as deleted (only the .git metadata was downloaded)
git checkout -- . 	# Restores the working tree files from the repository objects
git log				# Shows what other commits are there
  • If directory listing is disabled
    • Getting the whole directory is hard
    • But two files are always accessible (if there’s a master branch):
      • .git/HEAD
      • .git/refs/heads/master
    • They can be used to obtain one object hash
    • With it you can download the object file & get new object hashes
    • Repeat until nothing new is found!
    • You end up with parts or the full repository on your hard disk
    • Tool to automate this process: GitTools
    • Demo
./gitdumper.sh http://demo.local/.git/ /output-directory/
git status 			# Shows the files as deleted (only the .git metadata was downloaded)
git checkout -- . 	# Restores the working tree files from the repository objects
git log				# Shows what other commits are there
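The hash-by-hash walk described above can be sketched end to end. For illustration it runs against a throwaway local repository and reads the .git files directly; against a real target, each read of a .git/... path would instead be a curl -s http://target/.git/... request:

```shell
# Create a throwaway repository so the walk has something to run against:
repo=$(mktemp -d) && cd "$repo"
git -c init.defaultBranch=master init -q
git -c user.name=demo -c user.email=demo@local commit -q --allow-empty -m 'initial commit'

# Step 1: HEAD names the current branch ref.
ref=$(cut -d' ' -f2 .git/HEAD)        # -> refs/heads/master
# Step 2: the ref file holds a commit hash -- the first object to fetch.
hash=$(cat ".git/$ref")
# Step 3: loose objects live at .git/objects/<2-char prefix>/<38-char rest>
# and are zlib-deflated; decompressing one reveals new hashes (tree, parents).
prefix=$(printf '%s' "$hash" | cut -c1-2)
rest=$(printf '%s' "$hash" | cut -c3-)
python3 -c "import zlib; print(zlib.decompress(open('.git/objects/$prefix/$rest','rb').read()).decode(errors='replace'))"
```

Repeating step 3 for every hash found in each decompressed object is exactly the loop that gitdumper.sh automates.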

Consequences

A closer look at .DS_Store files

Problem 1

  • This filename starts with a dot, so the Finder won’t show it by default
    • This is bad if you share a ZIP archive or copy files onto a USB stick: others can learn which files were stored in the directory, even if you didn’t copy them all

Problem 2

  • If you use a Mac for development & you use a tool like scp/rsync/ftp to transfer files from your local development environment to a server
  • For example, if the deployment process is scp -r ./code/ server:/var/www/html/ (or the rsync/ftp equivalent)
  • And if you don’t exclude .DS_Store files from the file transfer
  • Then …
    • All files, including .DS_Store, are transferred & exposed
    • Each uploaded .DS_Store lists the files of its directory, so it mirrors the contents of the corresponding directory on the server. (This is also why you get a conflict notification when you try to transfer a file that already exists on the server.)
    • => You can tell whether a directory exists on the server without bruteforcing, even if directory listing is disabled
    • You can also iterate the process (recursively fetch the .DS_Store file of each discovered subdirectory) to obtain the whole remote file structure
  • Tools for automation:

Demo

$ file samples/.DS_Store.ctf
samples/.DS_Store.ctf: Apple Desktop Services Store
$ hexdump -C samples/.DS_Store.ctf
# Output: A lot of binary data, you can't read much
$ python main.py samples/.DS_Store.ctf
# Output: File names recovered
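When the parser tool isn’t at hand, a cruder trick often works: the file names inside a .DS_Store are stored as UTF-16BE strings, which GNU strings can surface with its big-endian 16-bit mode. The toy binary below stands in for a real .DS_Store (both file names are invented):

```shell
# Build a stand-in file that uses the same UTF-16BE name encoding as .DS_Store:
python3 -c "open('fake_ds_store','wb').write(b'\x00\x00\x00\x01Bud1' + 'backup.sql'.encode('utf-16-be') + b'\x00\x00' + 'id_rsa.pem'.encode('utf-16-be'))"

# -e b = decode 16-bit big-endian characters; this recovers the stored names:
strings -e b fake_ds_store
```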

Consequences

  • Directory listing ‘bypass’
  • Disclosure of (probably) accessible files on the server (without using bruteforce or having directory listing enabled)
    • Backup files
    • Database files
    • Temporary files
    • Key files (e.g. private keys from SSL-enabled Web servers)
    • Config files
  • Files: .bak, .gz, .db, .eml, .old, .inc, .config, .sql, .pem, …
  • ~10k of Alexa’s Top 1M sites are affected (see “Scanning the Alexa Top 1M for .DS_Store files”)
  • Examples of similar bugs found by bug bounty hunters:
$ curl 'http://api.../.DS_Store' -o .DS_Store
$ ./fdb.py --type ds --filename .DS_Store --base_url http://api.../
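The file names recovered from a .DS_Store can be crossed with the backup extensions listed above to build a focused wordlist for probing the server. A sketch (the two base names are invented):

```shell
# Base names as they might come out of a parsed .DS_Store (invented here):
printf '%s\n' config.php database > names.txt

# Cross every name with common backup/leftover extensions:
while read -r name; do
  for ext in .bak .old .gz .sql .pem; do
    echo "$name$ext"
  done
done < names.txt
```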

Other files

  • There are a lot more files with sensitive content
  • Be motivated to explore them & share your knowledge
  • Examples:
    • .svn/
    • .idea/
    • .swp
    • .htpasswd
    • .coredump
    • winscp.ini
    • filezilla.xml
    • /server-status

Resources


See you next time!
