Posts

Autoencoders

Autoencoders are artificial neural networks capable of learning efficient representations of the input data, called codings, without any supervision: the training set is unlabeled. These codings typically have a much lower dimensionality than the input data, which makes autoencoders useful for dimensionality reduction.

Why use autoencoders?
1. They are useful for dimensionality reduction.
2. They act as powerful feature detectors and can be used for unsupervised pre-training of deep neural networks.
3. They are capable of randomly generating new data that looks very similar to the training data; this is called a generative model.

Surprisingly, autoencoders work by simply learning to copy their inputs to their outputs. This may sound like a trivial task, but we will see that constraining the network in various ways can make it rather difficult. For example, you can limit the size of the internal representation, or you can add noise to the inputs and train th...
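As an illustrative sketch (not from the original text), the copy-the-input idea with a constrained internal representation can be shown with a tiny linear autoencoder in NumPy. The data dimensions, learning rate, and step count below are assumptions chosen only for the demonstration; the encoder squeezes 10-dimensional inputs through a 3-dimensional bottleneck, and plain gradient descent drives the reconstruction error down.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples in 10 dimensions that actually live on a 3-D subspace.
latent = rng.normal(size=(200, 3))
mixing = rng.normal(size=(3, 10))
X = latent @ mixing

# Linear autoencoder: encoder W1 (10 -> 3 codings), decoder W2 (3 -> 10).
W1 = rng.normal(scale=0.1, size=(10, 3))
W2 = rng.normal(scale=0.1, size=(3, 10))
lr = 0.01

def loss(X, W1, W2):
    # Mean squared reconstruction error: how badly we "copy input to output".
    return np.mean((X @ W1 @ W2 - X) ** 2)

initial = loss(X, W1, W2)
for _ in range(500):
    codings = X @ W1          # low-dimensional internal representation
    recon = codings @ W2      # reconstruction of the input
    err = recon - X           # gradient of the loss w.r.t. recon (up to a constant)
    grad_W2 = codings.T @ err / len(X)
    grad_W1 = X.T @ (err @ W2.T) / len(X)
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
final = loss(X, W1, W2)

print(initial > final)  # reconstruction error drops as the copy task is learned
```

Because the bottleneck has only 3 units, the network cannot copy the 10 input values directly; it is forced to discover the 3-dimensional structure of the data, which is exactly the dimensionality-reduction behavior described above.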

PROBABILITY

DEFINITION OF PROBABILITY
The probability of an event A happening in sample space S is the ratio between the number of cases favorable to the event (nA) and the total number of possible cases (n):

P(A) = nA / n

BASIC PROBABILITY RULES
1. Probability Variation Field. The probability of an event A happening is a number between 0 and 1: 0 <= P(A) <= 1.
2. Probability of the Sample Space. The sample space S has probability equal to 1: P(S) = 1.
3. Probability of an Empty Set. The probability of the empty set (∅) occurring is null: P(∅) = 0.
4. Probability Addition Rule. The probability of event A, event B, or both happening can be calculated as follows: P(A ∪ B) = P(A) + P(B) - P(A ∩ B).
5. Probability of a Complementary Event. If Ac is A's complementary event, then P(Ac) = 1 - P(A).
6. Probability Multiplication Rule for Independent Events. If A and B are two independent events, the probability of them happening together is equal to the product of their individual probabilities: P(A ∩ B) = P(A) · P(B).
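The rules above can all be checked on a concrete sample space. The sketch below (an assumed example, not from the original text) uses one roll of a fair six-sided die, with A = "the roll is even" and B = "the roll is at most 2", chosen so that A and B happen to be independent.

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die (assumed example).
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # event A: the roll is even
B = {1, 2}      # event B: the roll is at most 2

def P(event):
    # Definition: P(E) = nE / n, favorable cases over possible cases.
    return Fraction(len(event), len(S))

assert 0 <= P(A) <= 1                      # Rule 1: variation field
assert P(S) == 1                           # Rule 2: sample space
assert P(set()) == 0                       # Rule 3: empty set
assert P(A | B) == P(A) + P(B) - P(A & B)  # Rule 4: addition rule
assert P(S - A) == 1 - P(A)                # Rule 5: complementary event
assert P(A & B) == P(A) * P(B)             # Rule 6: these A and B are independent
print(P(A | B))  # -> 2/3
```

Using `Fraction` keeps every probability exact, so each rule is verified with no floating-point rounding.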

Recurrent Neural Network (RNN)

What are recurrent neural networks?
A recurrent neural network (RNN) is a type of artificial neural network that works with sequential data or time-series data. These deep learning algorithms are commonly used for ordinal or temporal problems such as language translation, natural language processing (NLP), and speech recognition; they are incorporated into popular applications such as Siri, voice search, and Google Translate. Like feedforward and convolutional neural networks (CNNs), RNNs learn from training data.

Why recurrent neural networks?
RNNs were created because feed-forward neural networks have a few limitations: they cannot handle sequential data, they consider only the current input, and they cannot memorize previous inputs. The RNN solves these issues. An RNN can handle sequential data, accepting both the current input and previously received inputs, and it can memorize previous inputs thanks to its internal memory.

LSTM (Variant of RNN)
Starting at the beginning, let's define our neural...
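The "internal memory" idea can be sketched as a single recurrence: the hidden state at each time step is computed from the current input and the previous hidden state. The NumPy sketch below is an illustration under assumed dimensions and random weights (a plain tanh RNN cell, not the LSTM variant mentioned above).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions: 4-dimensional inputs, 8-dimensional hidden state.
input_size, hidden_size = 4, 8
Wx = rng.normal(scale=0.1, size=(input_size, hidden_size))   # input-to-hidden weights
Wh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden-to-hidden weights
b = np.zeros(hidden_size)

def rnn_forward(inputs):
    """Run a simple tanh RNN over a sequence, carrying the hidden state."""
    h = np.zeros(hidden_size)   # the internal memory starts empty
    states = []
    for x in inputs:
        # The new state depends on the current input AND the previous state:
        h = np.tanh(x @ Wx + h @ Wh + b)
        states.append(h)
    return states

sequence = rng.normal(size=(5, input_size))   # a sequence of 5 time steps
states = rnn_forward(sequence)
print(len(states), states[-1].shape)  # -> 5 (8,)
```

Because `h` is fed back into the next step, the state after step t reflects every input seen so far; a feed-forward network applied to each `x` independently would have no such memory.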