
7. Searching 2

Yohan Hyunsung Yi
Journey to Tech Company
3 min read · May 20, 2018


7. 2. Brute-Force

Brute: a beast; something unthinking
Force: strength
As the name suggests, it is a very simple, naive algorithm: it solves a problem by trying every possible case. As the problem gets more complicated and the input grows larger, the running time increases rapidly, so the brute-force algorithm is very inefficient in terms of time.
However, the brute-force algorithm is correspondingly easy to implement.

Beyond that, the brute-force algorithm has a different kind of advantage: it is a useful baseline before thinking about other algorithms.
It is the starting point for devising a more advanced algorithm.
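As a concrete illustration (this example is not from the original text), a brute-force solution to the classic two-sum problem simply tries every pair of elements until one works:

```python
def two_sum_brute(nums, target):
    """Return indices (i, j) of two elements summing to target, or None.

    Brute force: examine every possible pair, with no cleverness at all.
    """
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return (i, j)
    return None
```

The double loop makes the inefficiency visible: the work grows quadratically with the input size, but the code is trivial to write and to verify, which is exactly the trade-off described above.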

7. 3. Jump Point Search

This section and the next describe the mechanical details and algorithmic properties of Jump Point Search. If you don’t care for such things, feel free to jump ahead and check out some screenshots.
JPS can be described in terms of two simple pruning rules that are applied recursively during search: one rule applies to straight steps, the other to diagonal steps. The key intuition in both cases is to prune the set of immediate neighbours around a node by trying to prove that an optimal path (symmetric or otherwise) exists from the parent of the current node to each neighbour, and that this path does not involve visiting the current node. Figure 1 outlines the basic idea.
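The straight-step rule can be sketched in a few lines (the grid representation and function names below are illustrative assumptions, not part of the original): when the search reached a node by a horizontal move, the only "natural" neighbour is the next cell straight ahead, and a diagonal neighbour is kept only when it is "forced" by an adjacent obstacle.

```python
def walkable(grid, x, y):
    """True if (x, y) is inside the grid and not an obstacle."""
    return 0 <= y < len(grid) and 0 <= x < len(grid[0]) and grid[y][x]

def pruned_neighbors_straight(grid, x, y, dx):
    """Neighbours kept by JPS pruning after reaching (x, y) moving by dx horizontally.

    All other neighbours are pruned: an optimal path from the parent can
    reach them without passing through (x, y).
    """
    kept = []
    if walkable(grid, x + dx, y):  # natural neighbour: keep going straight
        kept.append((x + dx, y))
    for dy in (-1, 1):  # forced neighbours: an obstacle beside us blocks the detour
        if not walkable(grid, x, y + dy) and walkable(grid, x + dx, y + dy):
            kept.append((x + dx, y + dy))
    return kept
```

On an open grid this keeps a single neighbour per straight step, which is what lets JPS "jump" over long runs of symmetric paths instead of expanding every node.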

7. 4. Linear Search

Linear search has the advantage that the data need not be sorted or otherwise prepared, and it is easy to implement; the disadvantage is that as the amount of data increases, the search time increases proportionally, because the elements are compared one by one. For example, in a list of ten items where the target sits in the last position, ten comparisons are needed; with one million items and the target in the last position, one million comparisons are required. This situation is called the worst case. The best case, by contrast, is when the target is found on the very first comparison.
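Linear search itself fits in a few lines (a minimal sketch, assuming a Python list and returning -1 when the target is absent):

```python
def linear_search(data, target):
    """Return the index of target in data, or -1 if it is not present."""
    for i, value in enumerate(data):
        if value == target:  # best case: this succeeds on the first comparison
            return i
    return -1  # worst case: len(data) comparisons, target absent or last
```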

7. 5. Power Set

To list all subsets of {a, b, c, d, e, f}:

  • List all subsets of {b, c, d, e, f}, i.e. the subsets that do not contain a.
  • List the sets obtained by adding a to each subset of {b, c, d, e, f}.
def power_set_1(set_):
    """Return all subsets of set_ (given as a list), ordered by size."""
    def k_subsets(k, start_ind):
        # All subsets of size k chosen from set_[start_ind:].
        if k == 0:
            return [[]]
        else:
            subsets = []
            for ind in range(start_ind, len(set_) - k + 1):
                for subset in k_subsets(k - 1, ind + 1):
                    subsets.append(subset + [set_[ind]])
            return subsets

    subsets = []
    for k in range(len(set_) + 1):
        for subset in k_subsets(k, 0):
            subsets.append(subset)
    return subsets
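The two bullet points above can also be implemented directly as a recursion on the first element (power_set_2 is a hypothetical name for this sketch, not from the original):

```python
def power_set_2(items):
    """Return all subsets of items: subsets without the head, then with it."""
    if not items:
        return [[]]  # the empty set has exactly one subset: itself
    head, rest = items[0], power_set_2(items[1:])
    return rest + [[head] + subset for subset in rest]
```

Both versions produce 2^n subsets for an n-element input; this one mirrors the "except a" / "add a" decomposition literally, at the cost of not grouping results by size.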

7. 6. Optimization Problem

An algorithm is optimal if, among all algorithms that solve the same problem, none performs fewer basic operations in the worst case.

  • Step 1: Devise an algorithm for the given problem that appears efficient, and let W(n) be the amount of work it does in the worst case, where n is the size of the input data.
  • Step 2: Prove that any algorithm capable of solving the given problem must perform at least F(n) operations, for some lower-bound function F.
  • Step 3: If the algorithm satisfies W(n) = F(n), then it is an optimal algorithm for the problem.
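A classic worked example of these three steps (added here as an illustration, not from the original): finding the maximum of n numbers. A single scan uses W(n) = n - 1 comparisons (Step 1), and any comparison-based algorithm must compare every element at least once to rule it out, so at least F(n) = n - 1 comparisons are necessary (Step 2). Since W(n) = F(n), the scan is optimal (Step 3).

```python
def find_max(nums):
    """Return the maximum of a non-empty list using exactly len(nums) - 1 comparisons."""
    best = nums[0]
    for x in nums[1:]:  # one comparison per remaining element: n - 1 in total
        if x > best:
            best = x
    return best
```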
