Final Review

Izzatul Muttaqin
PPL C6 Big Data
May 29, 2019

In this sixth individual review, I will explain three topics: Docker Orchestration, Scalability & Profiling, and Testing: Stress and Penetration.

Docker Orchestration

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Docker orchestration, then, is container orchestration with Docker as the container runtime: managing the lifecycles of containers, especially in large, dynamic environments.
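As a rough illustration of what an orchestration tool manages, the Docker Compose sketch below starts a web container and a Postgres container together. The service names, port, and image tag are assumptions for the example, not our actual configuration.

    version: "3"
    services:
      web:
        build: .              # image built from the project Dockerfile
        ports:
          - "3000:3000"       # assumed Express port
        depends_on:
          - db                # start the database before the web server
      db:
        image: postgres:11    # assumed Postgres image tag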

Dockerfile

The Dockerfile describes how each container image is built, while the orchestration configuration is where you tell the orchestration tool where to gather container images, how to establish networking between containers, how to mount storage volumes, and where to store logs for that container. Our DevOps engineer, Ashlah, branches and version-controls these configuration files so the same application can be deployed across different development and testing environments before it reaches the production cluster. Deployment follows the flow: merge to the GitLab staging branch → tests pass (including lint) → deploy. Once the GitLab CI/CD pipeline finishes, the web application is ready to use.
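A minimal sketch of such a pipeline, assuming a Node image and npm scripts for lint, test, and deployment; the job names, image tag, and scripts below are illustrative, not our actual .gitlab-ci.yml.

    stages:
      - test
      - deploy

    lint-and-test:
      stage: test
      image: node:10            # assumed Node.js image
      script:
        - npm install
        - npm run lint          # assumed lint script
        - npm test

    deploy-staging:
      stage: deploy
      only:
        - staging               # runs only on the staging branch
      script:
        - ./deploy.sh           # assumed deployment script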

Scalability & Profiling

Scalability is an attribute that describes the ability of a process, network, software or organization to grow and manage increased demand. A system, business or software that is described as scalable has an advantage because it is more adaptable to the changing needs or demands of its users or clients.

Profiling, a form of dynamic program analysis (as opposed to static code analysis), is the investigation of a program’s behavior using information gathered as the program executes. The usual purpose of this analysis is to determine which sections of a program to optimize, whether to increase its overall speed, decrease its memory requirement, or both.

Scalability

Promise snippet for accessing the database

Within our server, asynchronous code is used to achieve scalability. In our Express project, asynchrony is handled with Promises: the Promise executor receives success and failure callbacks, and then/catch resolve the success and failure conditions of the program, as sketched below. This is applied when uploading to and fetching data from the Postgres database, so the server is able to maintain the same performance as the volume of data increases.
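A minimal sketch of this pattern, assuming the pg library and an illustrative /data route; the route, table name, and connection settings are hypothetical, not our actual code.

    // Hypothetical Express route that reads from Postgres through a Promise.
    const express = require('express');
    const { Pool } = require('pg');

    const app = express();
    const pool = new Pool();    // connection settings taken from environment variables

    app.get('/data', (req, res) => {
      pool.query('SELECT * FROM uploads')                              // query() returns a Promise
        .then(result => res.json(result.rows))                         // success condition
        .catch(err => res.status(500).json({ error: err.message }));   // failure condition
    });

    app.listen(3000);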

Profiling

To profile the Express Node.js server, the V8 profiler is used, which is the profiler built into Node.js. To start profiling, the server is run with the following command:

node --prof ./app

The profiler writes a log file; in this test, the command above produced isolate-000002140EC4B010-v8.log. To inspect the contents of the log file, run:

node --prof-process isolate-000002140EC4B010-v8.log

The profiler output shows that the highest tick count for the server is 164 ticks. The ticks are spent mostly in modules rather than in our own code, so it would be hard to improve and optimize the server further.

Testing: Stress and Penetration

In Software Engineering, Stress Testing is also known as Endurance Testing. In stress testing, the system is stressed for a short period of time to determine its withstanding capacity. The most prominent use of stress testing is to determine the limit at which the system, software, or hardware breaks. It also checks whether the system demonstrates effective error management under extreme conditions.

Penetration test, also known as a pen test, is a simulated cyber attack against your computer system to check for exploitable vulnerabilities. In the context of web application security, penetration testing is commonly used to augment a web application firewall (WAF).

Stress Testing

Apache Benchmark (ab) is used to stress-test the file-upload endpoint of the server with 10 requests and a concurrency of 2. In the result, all 10 requests completed. The test ran in 29.545 seconds with 0 failed requests, the total data transferred was around 10.02 MB, and the time per request was around 5908.964 ms.
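The invocation has roughly the following form; the endpoint URL, payload file, and content type below are assumptions for illustration, not the exact values used in the test.

    ab -n 10 -c 2 -p upload.csv -T "text/csv" http://localhost:3000/upload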

Penetration Testing

1. Planning and reconnaissance
The first stage involves:

  • Defining the scope and goals of a test, including the systems to be addressed and the testing methods to be used.
  • Gathering intelligence (e.g., network and domain names, mail server) to better understand how a target works and its potential vulnerabilities.

2. Scanning
The next step is to understand how the target application will respond to various intrusion attempts. This is typically done using:

  • Static analysis — Inspecting an application’s code to estimate the way it behaves while running. These tools can scan the entirety of the code in a single pass.
  • Dynamic analysis — Inspecting an application’s code in a running state. This is a more practical way of scanning, as it provides a real-time view into an application’s performance.

3. Gaining Access
This stage uses web application attacks, such as cross-site scripting, SQL injection and backdoors, to uncover a target’s vulnerabilities (a small SQL-injection sketch follows this list). Testers then try to exploit these vulnerabilities, typically by escalating privileges, stealing data, intercepting traffic, etc., to understand the damage they can cause.

4. Maintaining access
The goal of this stage is to see if the vulnerability can be used to achieve a persistent presence in the exploited system — long enough for a bad actor to gain in-depth access. The idea is to imitate advanced persistent threats, which often remain in a system for months in order to steal an organization’s most sensitive data.

5. Analysis
The results of the penetration test are then compiled into a report detailing:

  • Specific vulnerabilities that were exploited
  • Sensitive data that was accessed
  • The amount of time the pen tester was able to remain in the system undetected
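As a small illustration of the kind of flaw the Gaining Access stage probes for, the sketch below contrasts an injectable query built by string concatenation with a parameterized query, reusing the hypothetical app and pool from the earlier Express sketch; the routes and table name are made up for the example.

    // VULNERABLE: user input is concatenated into the SQL string, so a crafted
    // id such as "1 OR 1=1" changes the meaning of the query itself.
    app.get('/report', (req, res) => {
      pool.query('SELECT * FROM uploads WHERE id = ' + req.query.id)
        .then(result => res.json(result.rows))
        .catch(err => res.status(500).end());
    });

    // SAFER: a parameterized query keeps user input as data, not as SQL.
    app.get('/report-safe', (req, res) => {
      pool.query('SELECT * FROM uploads WHERE id = $1', [req.query.id])
        .then(result => res.json(result.rows))
        .catch(err => res.status(500).end());
    });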
