Summer Camp 2020: Web Application Exploitation

Aaron James walks us through how he would solve the Summer Camp 2020 – Web Application Exploitation challenge: Black Mesa.

Hey everybody, it’s Aaron.

If you caught the Black Mesa Challenge yesterday, hopefully you’ve had a chance to take a crack at it. If not, and you don’t want to spoil the puzzle for yourself, head back on over to the original challenge and fight on. This post will be a guided walk-through of the solution.

If you have prerequisite knowledge of git, this challenge will be on the easier side. Otherwise, it will be a tad more difficult. In either case, if you are flexible with your Google-fu, you can get to the solution regardless.

First, we begin by pulling up the front page of the Black Mesa Website. You might see a lot of paragraphs and images that seem directed at you, but that’s just Black Mesa trying to taunt you. Avoid the distractions and focus on the prompt:

...try and break in to their admin portal to steal their flag...

We might think the conspicuous Log in button at the top right-hand corner of the screen will get us somewhere.

Unfortunately, after clicking, we are greeted with yet another taunt:

There is a login page, but we're not just going to tell you where it is.
For all we know, you could be a hacker.

Well, this is frustrating. It’s probably not intentional, but there’s definitely some useful information here: it’s clearly telling us there IS a hidden login page. We just need to find it.

Remember: we shouldn’t run a brute-forcer like DirBuster on this website. The server isn’t very powerful, and the poor rig would probably die. Let’s avoid that.

If there’s one piece of advice I have that applies to ALL of web exploitation, it’s this:

If you want to manually check for one file on a web server, check /robots.txt

robots.txt is a file that many web servers provide at the root directory of a public website for the benefit of search-engine crawlers (e.g. Google), so that they index the site in the owner’s intended fashion. It can contain a variety of directives, but mainly ones that permit or disallow a crawler from indexing certain pages on the website.

Interestingly enough, although these files aren’t intended for human viewers, you can also pull them up straight in your browser. Check out Facebook’s robots.txt. You’ll notice it contains a long list of file paths that Facebook needs but doesn’t want cluttering search engines, all relative to facebook.com itself.

A lot of the time, we can use the paths in these files to find out where the creator of the website doesn’t want us to go. After all, as hackers, that might be exactly where we want to go.
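To make that concrete, here is a minimal sketch of pulling the disallowed paths out of a robots.txt. The file contents below are a made-up sample, not Black Mesa’s; against a live site you would fetch the real thing with curl instead of using a saved copy:

```shell
# robots.txt is plain text, so extracting the Disallow paths is one awk away.
# The file below is a made-up sample; on a live target you would fetch it with
# something like: curl -s https://target.example/robots.txt
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /private/
Disallow: /staging/
EOF

awk '/^Disallow:/ {print $2}' robots.txt   # prints each hidden path on its own line
```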

Anyways, we pull up Black Mesa’s robots.txt and are greeted with some interesting information:

User-agent: *
Disallow: /admin.html
Disallow: /.git/

Okay, fascinating. The first thing one notices is the /admin.html file, which, sure enough, you can browse to:

Trying to log in with no email, no password, non-emails in the email field, random characters, oversized inputs, and pretty much anything else you can think of all results in the same error:

Hmm. Maybe during a field assignment you would consider brute-forcing the login page with SQL injection strings or password guesses, but not only is that technically against the rules of engagement (no automated brute-forcing), we also don’t have any information to seed our login attempts with. This particular login page gives no hint about email existence or any other differentiated error, so we’ll probably need to come back here once we get further.

On the other hand, the second entry in the robots.txt file is very interesting. Those well-acquainted with git version control will already know exactly what it is. Those less acquainted with git will still probably be able to make the connection with some well-executed Googling.

The most important takeaway is that an exposed git folder is almost certainly a huge source of sensitive information. Namely, it is almost always used to manage source code of some kind, perhaps even the source code of the very web server you’re currently interacting with. That would certainly help for the purposes of finding vulnerabilities.

Here’s what I pull up when I search “exposed git folder” on Google:

This first article essentially walks you through the entire process of downloading the git folder from the Black Mesa site and reconstructing the source code from it. It even links to a handy tool that automates most of the process. I’ll let that article explain the details.

If you want more technical information about how this works, check out this great, recent Medium article that appears just below that first search result and talks about the same subject in greater detail.

Here are, roughly, the commands I ran to download the entire source tree of Black Mesa’s website, using the gitdumper tool:

# download the scraper
curl https://raw.githubusercontent.com/internetwache/GitTools/master/Dumper/gitdumper.sh --output ./gitdumper.sh
# make it executable and run it
chmod u+x ./gitdumper.sh
./gitdumper.sh https://blackmesa.irs.sh/.git/ ./blackmesa
# use the git metadata to restore the source code
cd ./blackmesa
git checkout -- .

Now that we have the source code of the website sitting in front of us, there’s quite a lot to look at. It’s hard to know where to look first.

You can poke through the source code all you like, but there are no flags in sight. Well, except this code inside app.js that SEEMS to send us the flag from a file if we successfully log in:

Unfortunately, there isn’t any secret.flag file inside the folder we downloaded. What gives? If we check .gitignore, the file that tells git which paths it should leave untracked, we find out very quickly:
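For anyone unfamiliar with how ignoring works, here’s a tiny demo repo. The .gitignore line is a hypothetical reconstruction (all we know from app.js is the filename secret.flag), but the mechanics are the same:

```shell
# git never tracks paths matched by .gitignore, so they never reach a clone
mkdir igdemo && cd igdemo && git init -q
printf 'secret.flag\n' > .gitignore     # hypothetical: one line ignoring the flag
printf 'FLAG{...}\n'   > secret.flag    # stand-in contents; the real flag is server-only
git check-ignore secret.flag            # prints "secret.flag": git refuses to track it
git status --porcelain                  # lists .gitignore as untracked, but not the flag
```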

Black Mesa was one step ahead of us! Despite downloading the entire program source code, the flag was never tracked by git, so it’s probably only on the server, and we can’t download it through gitdumper. If we want it, we’ll have to log into the server admin portal with a valid email-password combo…

But look on the bright side. There was at least one file tracked on git that probably shouldn’t have been:

The entire user database! Let’s open it in a text editor!

Oh, darn it. It’s a binary file, and a text editor can’t make sense of it.

Looking back at the source code reveals it’s a sqlite3 database file. A speedy Google search for “sqlite3 database file viewer” quickly turns up a handy website that will show us what’s inside:
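If you’d rather stay in the terminal than use a web viewer, the sqlite3 command-line tool reads these files directly. The table and column names below are guesses at Black Mesa’s schema, demonstrated on a stand-in database:

```shell
# build a stand-in database (the real schema is Black Mesa's; this one is a guess)
sqlite3 demo.db "CREATE TABLE users (email TEXT, password TEXT);"
sqlite3 demo.db "INSERT INTO users VALUES ('someone@example.com', '<hash goes here>');"

sqlite3 demo.db ".tables"                      # list the tables in the file
sqlite3 demo.db "SELECT email FROM users;"     # dump a column, just like the web viewer
```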

Okay sweet! We have some valid emails! Unfortunately, those passwords are hashed, so we can’t log in with them directly. We’d need to crack them with a tool like hashcat, but to be honest, I’d give up before we even started. By looking at these hashes I can identify them as bcrypt, which can be one of the most difficult password hashing algorithms to crack. We simply don’t have enough information to dive down that rabbit hole quite yet, so let’s keep looking around.
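As a quick aside, hash formats are often recognizable by shape alone: bcrypt starts with a version prefix like $2b$ followed by a cost factor, while raw md5 is 32 hex digits. Here’s a rough triage helper; the sample hashes are illustrative, not from the challenge database:

```shell
# crude hash triage by format alone; shape is a hint, not proof
classify() {
  case "$1" in
    '$2a$'* | '$2b$'* | '$2y$'*)
      echo 'bcrypt' ;;                  # version prefix, cost factor, then salt+hash
    *)
      if printf '%s' "$1" | grep -qE '^[0-9a-f]{32}$'; then
        echo 'possibly md5'             # 32 hex digits: md5 (or another 128-bit hash)
      else
        echo 'unknown'
      fi ;;
  esac
}

classify '$2b$10$N9qo8uLOickgx2ZMRZoMyeIjZAgcfl7p92ldGxad68LJZdL17lhWy'  # bcrypt
classify '5f4dcc3b5aa765d61d8327deb882cf99'                              # possibly md5
```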

It’s frustrating when you’ve fought to learn so much information about the application, and you still don’t seem to have the last missing piece you need to reveal the solution. In times like these, take a deep breath, a step back, and think gently about everything you have available to you.

Remember that this challenge is oriented around git! A ton of valuable information is stored in git: not just the current state of the project, but every state it’s ever had in the past too.

Open up your terminal, we’re going to try and dig a little deeper into the git repository. Let’s see what kind of history this repo has by using the git log command:

Here we see exactly two commits, that is, two recorded versions of this application’s source code. The message of the second commit says simply, “Updated to bcrypt”. Hm. That’s interesting. Does that imply this project once used something other than bcrypt?
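If you want to see the same shape of history without the challenge files, a throwaway repo reproduces it. Only the commit messages here mirror the challenge; everything else is made up:

```shell
# a throwaway repo with the same two-commit shape as Black Mesa's
mkdir logdemo && cd logdemo && git init -q
echo 'hash = md5(password)'    > app.js
git add -A && git -c user.name=demo -c user.email=demo@example.com commit -qm 'Initial commit'
echo 'hash = bcrypt(password)' > app.js
git add -A && git -c user.name=demo -c user.email=demo@example.com commit -qm 'Updated to bcrypt'

git log --oneline   # newest first; each line starts with the abbreviated commit hash
```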

With git, we can rollback the repository to a previously recorded state, in this case, before bcrypt was added as a hashing algorithm. To do this, we use the git checkout command:

Notice we pass the seven-character abbreviated commit hash from the git log output to identify the version we want to go back to. We also get a warning about a “detached HEAD” state, but for our purposes that won’t matter.
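On a throwaway repo, the rollback looks like this (file contents are made up; only the commit messages mirror the challenge):

```shell
# build two commits, then roll back to the first one by its abbreviated hash
mkdir codemo && cd codemo && git init -q
echo 'version 1' > file.txt
git add -A && git -c user.name=demo -c user.email=demo@example.com commit -qm 'Initial commit'
echo 'version 2' > file.txt
git add -A && git -c user.name=demo -c user.email=demo@example.com commit -qm 'Updated to bcrypt'

old=$(git rev-parse --short HEAD~1)   # the short hash, as printed by git log
git checkout -q "$old"                # -q hides the detached-HEAD warning
cat file.txt                          # prints "version 1": the working tree went back in time
```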

Git just reverted some files to the versions they had before the bcrypt update. What actually changed, though? Git can tell you that too, with the git diff command:

[Screenshot: output of running “git diff master” in the challenge repo]

We pass the word “master” to indicate that we want the difference between the current state (which at the moment is the checked-out version from the past) and master, the latest version. In other words: what files changed?
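Here’s a self-contained version of the same comparison. The branch name is captured up front, since newer git versions may default to “main” instead of “master”:

```shell
# two commits, roll back, then diff against the branch tip
mkdir diffdemo && cd diffdemo && git init -q
tip=$(git symbolic-ref --short HEAD)   # "master" historically; may be "main" on newer gits
echo 'uses md5'    > users.db
git add -A && git -c user.name=demo -c user.email=demo@example.com commit -qm 'Initial commit'
echo 'uses bcrypt' > users.db
git add -A && git -c user.name=demo -c user.email=demo@example.com commit -qm 'Updated to bcrypt'

git checkout -q HEAD~1      # detached at the older commit
git diff --stat "$tip"      # compact summary of every file that differs from the tip
```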

Looking at this compact diff report, we see that exactly four files changed. One of them in particular is very interesting: the database differs now that we’ve rolled back. Let’s open it back up in the sqlite3 viewer and see what it looks like now:

[Screenshot: the rolled-back database opened in the sqlite3 viewer]

Wow, the emails are the same, but the passwords look completely different now. They look more like weak md5 hashes than bcrypt hashes, and reading the rolled-back login code in app.js confirms the suspicion.
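You can see why unsalted md5 is so weak: the same password always produces the same 32-hex-digit hash, so a precomputed lookup table (which is essentially what CrackStation is) defeats it instantly. On Linux the tool is md5sum; macOS ships md5 instead:

```shell
# identical inputs give identical digests, so one big lookup table cracks them all
printf '%s' 'weakpassword' | md5sum    # 32 hex digits, no salt, no cost factor
printf '%s' 'weakpassword' | md5sum    # run it again: the exact same digest
```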

We might be able to crack these passwords incredibly easily. Let’s first try loading them into an online database cracker like CrackStation and see if we get lucky with any of them being in its tables. I load all 12 hashes into the input and hit Crack:

[Screenshot: CrackStation results for the 12 hashes]

Interestingly, most of them survived, but one in particular was found quite easily. Its plaintext is weakpassword, and looking at the database, we see it corresponds to the email aaron@blackmesa.irs.sh.

I don’t know who this Aaron guy is, but he needs to take some password tips from his friends. Let’s hurry up and try these credentials on the Black Mesa admin panel:

[Screenshot: a successful login to the Black Mesa admin panel, revealing the flag]

And there we have it folks! 🎉🎉

Thank you so much for coming along for the ride, this challenge was super exciting to set up and exploit! I hope you enjoyed it, and if you have any further questions, just post in the comments below.
