Lecture

Neural Networks That Remember Sequences: RNN

An RNN (Recurrent Neural Network) is a neural network that processes input and output with a focus on the order of the data.

Unlike traditional neural networks, which handle each input independently, an RNN remembers information from previous inputs while processing the current one.


When to Use RNN?

Data where the order of each element is important is known as Sequential Data, and particularly, a set of values arranged in temporal order is referred to as Time Series Data.

RNN is primarily used to process sequential data where the input order is crucial.

Here are some examples:

  • Sentences: The meaning of a sentence is completed when characters or words are sequentially connected.

  • Speech: The order of sounds must be considered to understand the meaning of speech.

  • Stock Price Analysis: When predicting stock prices that change over time, past data points are utilized.

RNN is advantageous for data where the flow of time or context is important.


How Does RNN Work?

RNN processes input not all at once, but in sequence, one at a time.

At each step, it keeps an internal summary of the inputs seen so far and uses that information when processing the next input.

For instance, let's assume RNN processes the sentence "I am going to school."

When interpreting the phrase 'to school,' remembering the preceding words 'I am' helps in understanding the sentence more accurately.

RNN is designed with a structure that can reflect this flow.

Example of Sequential Processing with RNN
Input: I am → going → to → school
The RNN processes 'going' while remembering 'I am,' and processes 'to' while remembering 'I am, going.'
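This step-by-step flow can be sketched with a toy recurrent update. The following is a minimal illustration, not a real RNN implementation: the hidden state is a single number, and the weights (`w_h`, `w_x`) and token values are arbitrary stand-ins for learned parameters and word embeddings.

```python
import math

def rnn_step(h, x, w_h=0.5, w_x=1.0):
    # The new hidden state mixes the previous state (memory)
    # with the current input.
    return math.tanh(w_h * h + w_x * x)

# Toy numeric values standing in for "I am", "going", "to", "school"
# (hypothetical encodings chosen just for illustration).
tokens = [0.1, 0.4, 0.2, 0.9]

h = 0.0  # hidden state starts empty
for t, x in enumerate(tokens):
    h = rnn_step(h, x)
    print(f"step {t}: input={x}, hidden state={h:.4f}")
```

The key point is that each call to `rnn_step` folds the new input into the existing hidden state, so the state after 'school' depends on every earlier word, and feeding the same tokens in a different order produces a different final state.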

What Are the Limitations of RNN?

While an RNN has the advantage of considering order, it struggles to retain information from far back in the sequence.

For instance, if a sentence is too long or too many time steps have passed, the network may forget earlier information and fail to make accurate predictions.
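This forgetting effect can be seen even in a toy scalar recurrent update (the same kind of illustrative `rnn_step` sketch as above, with made-up weights): because each step scales down the old state's contribution, the first input's influence on the final state fades as the sequence grows.

```python
import math

def rnn_step(h, x, w_h=0.5, w_x=1.0):
    # Toy recurrent update: the old state is shrunk by w_h < 1 each step.
    return math.tanh(w_h * h + w_x * x)

def final_state(first_token, later_tokens):
    h = rnn_step(0.0, first_token)
    for x in later_tokens:
        h = rnn_step(h, x)
    return h

later = [0.3] * 20  # twenty identical later inputs
# Even a big change in the first token barely moves the final state:
diff = abs(final_state(1.0, later) - final_state(-1.0, later))
print(f"influence of the first token after 20 steps: {diff:.1e}")
```

LSTM and GRU cells address exactly this: their gates let the network choose to carry early information forward instead of letting it decay at every step.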

To address these issues, improved structures such as LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit) have been developed.

These structures are designed to effectively remember old information.


In the next lesson, we will delve into the structure of RNN, exploring the principles by which it remembers sequences and processes information.

Mission

Which of the following best fills in the blank?

An RNN is a neural network that processes input and output with a focus on the ______ of the data.
Size
Type
Shape
Order
