# AI Jargon Part 2

- Scaling: Rescaling your data so that values measured in different units (hours, percentages, etc.) are placed on the same scale and can be compared directly.
- Support Vector Machines (SVMs): Supervised algorithms mostly used to classify data; they tend to shine on smaller datasets where precision matters (more detail in our two-week course).
- Deep High-order Neural Network with Structured Output (HNNSO): A type of neural network which allows you to examine the relationship between structured input data and structured output data.
- Recurrent Neural Networks (RNNs): A type of artificial neural network that takes the order of its inputs into account instead of treating them as independent. This makes it very good for Natural Language Processing, where the order of words is so important.
- NoBackTrack algorithm: Helps train recurrent neural networks without the need to backtrack (go back through each earlier step of the network to make changes that improve outcomes).
- Long Short-Term Memory (LSTM): A recurrent neural network architecture designed to solve a problem those networks experience called “long-term dependencies”: as the distance between relevant pieces of information grows, the network has a harder time holding onto contextual information and using it.
- Hidden Markov Models: A type of Bayesian network in which the observed outputs are produced by a sequence of hidden states you never see directly.
- Deep Belief Network: A type of neural network with hidden layers that can feed into each other to calculate probabilities or perform classifications. It’s a precursor to deep learning.
- Convolutional Deep Networks: Another term for a Convolutional Neural Network (CNN)
- LAMSTAR: Large Memory Storage And Retrieval neural networks, increasingly being used in medical and financial applications.
- Deep Q-Network agent: Used by Google DeepMind; it is based on reinforcement learning, a machine learning approach inspired by behavioral psychology in which an agent learns from rewards and penalties.
- Logistic Regression: A classification algorithm (not a regression one) which happens to use regression techniques. It looks at a bunch of variables and predicts the probability of a data point falling into a specific category based on them.
- TensorFlow: An open-source library developed by Google Brain for building and training neural networks to discover patterns and correlations in data.
- Hyperplane: Take however many dimensions you’re working with, and a hyperplane is a flat subspace with one dimension fewer. If you’re dealing with two-dimensional space, it’s a line. In three-dimensional space, it’s a plane.
- Neuro-fuzzy: A neural network in which the hidden layers have been programmed to operate based on fuzzy instead of Boolean logic.
- Restricted Boltzmann Machines (RBMs): A type of artificial neural network which can learn a probability distribution over its set of inputs. These can be “stacked” to create deep belief and deep learning networks.
- Attribute-Value Pairs: One of the most basic types of data, consisting of an attribute and its value, represented as <attribute name, value>
- Discrete Output Values: When the value your algorithm produces is one of a finite set of values and not on a continuous scale. For example, what languages you speak as opposed to how much time you spend reading about AI.
- The Curse of Dimensionality: “The curse of dimensionality refers to how certain learning algorithms may perform poorly in high-dimensional data.” -Chi Zeng, Google engineer
- The Kernel Method: A class of pattern-recognition algorithms, including Support Vector Machines (SVMs), that use kernel functions.
- Kernel: A small matrix used in image recognition AI to detect edges and other aspects of an image.
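A few of these terms are easier to grasp with a tiny example. Here is a minimal sketch of the scaling entry in plain Python, using min-max scaling (one common scaling method; the function name and sample numbers are just illustrative):

```python
def min_max_scale(values):
    """Rescale a list of numbers so they all fall between 0 and 1."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # All values identical: avoid dividing by zero.
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# Weekly hours, originally on a very different scale than percentages:
hours = [35, 40, 50, 60]
print(min_max_scale(hours))  # [0.0, 0.2, 0.6, 1.0]
```

Once two columns of data are both squeezed into the 0-to-1 range, you can compare them directly even if one was measured in hours and the other in percent.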
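The logistic regression entry can also be sketched in a few lines of plain Python. The weights below are made up for illustration; in practice they would be learned from data:

```python
import math

def predict_proba(weights, bias, features):
    """Logistic regression prediction: a weighted sum of the input
    variables, squashed through the sigmoid function so the result
    is a probability between 0 and 1."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1 / (1 + math.exp(-z))

# With zero weights the model is maximally unsure: probability 0.5.
print(predict_proba([0.0, 0.0], 0.0, [3.0, 7.0]))  # 0.5
```

Whatever the inputs, the sigmoid keeps the output strictly between 0 and 1, which is what lets you read it as the probability of a data point falling into a category.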
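And the kernel entry, the small matrix used for edge detection, can be demonstrated on a toy “image”. This is a bare-bones sketch of sliding a Sobel-style kernel over a grid of pixel values (real image libraries do this far more efficiently):

```python
def convolve2d(image, kernel):
    """Slide a small kernel over a 2D image (no padding) and return
    the weighted sum at each position."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = 0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A horizontal Sobel kernel responds strongly to vertical edges:
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

# Toy image: dark pixels (0) on the left, bright pixels (9) on the right.
image = [
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
]
edges = convolve2d(image, sobel_x)
print(edges)  # [[0, 36, 36]] -- zero in the flat region, large at the edge
```

Flat patches of the image produce zeros, while the jump from dark to bright produces a large response, which is how a kernel “detects” an edge.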

Want to learn more practical stuff about AI? If you’re in Sofia, Bulgaria, sign up for our fixer sessions. But if you’re really ready to figure out how you can apply AI to your business, sign up for our 2-week online course on Spotting Business Opportunities With AI.