
[daily] 2022-01-11~2022-01-14

My flow record from 2022-01-11 to 2022-01-14 ^ _ ^

2022-01-11

Yesterday, I asked Professor Chen the question “When can I go home?”. This morning I got the answer: “Within five days before the Spring Festival”. It is sad. I don’t want to work, but I need to work.

In addition, I needed to change my bullet train ticket from the 19th to the 25th. That was also sad, because the refund succeeded but buying a new ticket failed. ZhiXing, the app I used to buy the ticket, is to blame for that.

Repeated attempts to buy a ticket wasted lots of my time. It was an inefficient morning. The only productive thing I did was watch a video on Bilibili explaining a paper.

It is about video generation. It sounds like a variant or advanced modification of VQ-VAE, which stands for “Vector Quantized Variational AutoEncoder”. The core idea is to quantize at 3 different scales.
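To make sure I understood the quantization idea, I wrote a tiny sketch of the core vector-quantization step (my own toy code with made-up shapes, not the paper’s implementation; the multi-scale variant would just run this at 3 resolutions):

import torch

def vector_quantize(z, codebook):
    """Replace each vector in z (N, D) with its nearest codebook entry (K, D)."""
    # Squared L2 distance between every latent vector and every code.
    dist = (z ** 2).sum(1, keepdim=True) \
         - 2 * z @ codebook.t() \
         + (codebook ** 2).sum(1)                # (N, K)
    indices = dist.argmin(dim=1)                 # nearest code per vector
    z_q = codebook[indices]                      # quantized latents (N, D)
    # Straight-through estimator: gradients flow back to the encoder.
    z_q = z + (z_q - z).detach()
    return z_q, indices

codebook = torch.randn(512, 64)                  # K=512 codes of dimension 64
z = torch.randn(16, 64, requires_grad=True)      # pretend encoder outputs
z_q, idx = vector_quantize(z, codebook)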

In the afternoon, I read some code about seq2seq, which I cloned from GitHub. Overall, it has 6 chapters; I was reading chapter 1.

The overall architecture of seq2seq is encoder-decoder. Both the encoder and the decoder are composed of RNNs or RNN variants (e.g. LSTM).
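A minimal sketch of that encoder-decoder shape in PyTorch (my own toy code, not the tutorial’s; names and sizes are invented): the encoder compresses the source sequence into its final hidden/cell states, which then seed the decoder.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                        # src: (batch, src_len)
        _, (hidden, cell) = self.rnn(self.embedding(src))
        return hidden, cell                        # context passed to the decoder

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, trg, hidden, cell):          # trg: (batch, 1), one step
        output, (hidden, cell) = self.rnn(self.embedding(trg), (hidden, cell))
        return self.out(output), hidden, cell      # logits over target vocab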

In the evening, I took part in a mass nucleic acid screening until 8:30 pm. After that, I browsed websites (especially ZhiHu) for a long while, then chatted with some old classmates. Finally, I went back to the dormitory and watched an American TV series, specifically “The Good Doctor”, until 2:00 am the next day.

What a decadent day!!!

2022-01-12

My Plan:

  • Finish the code reading of seq2seq and convert the ipynb tutorial into a Python script.

Good news: in the morning, I got my new bullet train ticket for the 26th in the 12306 app.

Sometimes I think Win11 is very slow and I want to roll back to Win10, but I found the rollback is very tedious once the Win11 upgrade is more than a week old, so I gave up. Another question: if I add Ubuntu as a second operating system, what benefit will I get? Should I do it?

Today I learned about a new Python package named spaCy, which provides lots of models for tagging, parsing, lemmatization and named entity recognition. I can download whatever models I want from this webpage, and the corresponding documentation is there.
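A quick usage sketch (my own toy sentence), assuming the small English model, which has to be downloaded first with "python -m spacy download en_core_web_sm":

import spacy

nlp = spacy.load("en_core_web_sm")                 # pretrained English pipeline
doc = nlp("Professor Chen answered my question on Tuesday.")

for token in doc:
    print(token.text, token.lemma_, token.pos_)    # lemmatization + tagging

for ent in doc.ents:
    print(ent.text, ent.label_)                    # named entity recognition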

Dependency syntactic parsing is one important part of the NLP domain. I’d love to figure out what it is, to make up for my poor understanding in class. “Dependency” is an unequal relationship between words: one party is the governor/regent/head, while the other is the modifier/subordinate/dependent. So, dependency analysis is analyzing these relationships between the words of a sentence.
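spaCy exposes exactly this head/dependent structure, so here is a small sketch to make it concrete (toy sentence of my own): for each word, token.head is its governor and token.dep_ names the relation.

import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The professor answered my question.")

for token in doc:
    # dependent <--relation-- governor (head)
    print(f"{token.text:10s} <--{token.dep_:8s}-- {token.head.text}")
# e.g. "professor <--nsubj  -- answered": "professor" depends on "answered"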

Parameter lists for some classes in torch.nn:

'''torch.nn.Embedding
- num_embeddings: size of the vocabulary (number of distinct tokens)
- embedding_dim: size of each embedding vector
'''

'''torch.nn.LSTM
- input_size: the dimension of each input vector, normally the embedding dim
- hidden_size: the dimension of the hidden state
- num_layers: the number of stacked RNN layers
- bias: default=True
- batch_first: default=False.
  By default the expected input shape is (seq_length, batch_size, input_size),
  but data normally comes as (batch_size, seq_length, embedding_dim),
  so either set batch_first=True or swap batch_size and seq_length.
- dropout: default=0
- bidirectional: default=False
'''

'''torch.nn.Linear
- in_features: size of each input sample
- out_features: size of each output sample
'''
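Putting the three together, a minimal forward pass (toy sizes of my own choosing) that also shows the batch_first point from the notes above:

import torch
import torch.nn as nn

vocab_size, emb_dim, hid_dim = 1000, 64, 128
embedding = nn.Embedding(num_embeddings=vocab_size, embedding_dim=emb_dim)
lstm = nn.LSTM(input_size=emb_dim, hidden_size=hid_dim,
               num_layers=2, batch_first=True)    # input is (batch, seq, emb)
linear = nn.Linear(in_features=hid_dim, out_features=vocab_size)

tokens = torch.randint(0, vocab_size, (8, 20))    # (batch_size=8, seq_length=20)
output, (h_n, c_n) = lstm(embedding(tokens))      # output: (8, 20, 128)
logits = linear(output)                           # (8, 20, 1000)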

2022-01-13

My Plan:

  • Finish the code reading of seq2seq and convert the ipynb tutorial into a Python script.

In the morning, when I downloaded WhatsApp again, I found mango was not there any more. It is sad, so I deleted it again.

2022-01-14

Watering…