References:

1. https://github.com/dragen1860/TensorFlow-2.x-Tutorials

2. "Adversarial Feature Learning"

This post is a record of my reading of the BiGAN paper, including some of my own understanding of BiGAN.

Because BiGAN's code implementation differs little from the classic GAN (see: https://www.cnblogs.com/DAYceng/p/16365562.html), the code is not covered here.

The source code is in reference 1.

The usual disclaimer: my level is limited, so please point out any mistakes. Thank you.

Note: if the images don't load you may need a proxy; the jsdelivr CDN seems to have been down lately.

Bidirectional GAN (BiGAN)

Compared with the classic GAN, BiGAN is more of a structural improvement: in addition to the generator G of the standard GAN framework, BiGAN adds an encoder E.

Structure

The whole model consists of three parts: the encoder network E, the generator network G, and the discriminator network D.

  • Encoder network E: extracts the latent variables of the original image

  • Generator network G: turns noise into an image

  • Discriminator network D: judges whether the input data pair (original image with its latent code, or generated image with its noise) comes from the encoder E or from the generator G
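To make the three components concrete, here is a minimal NumPy sketch of the shapes involved. The tiny linear "networks", the dimensions, and all names below are my own illustration, not the paper's or the tutorial's code:

```python
import numpy as np

rng = np.random.default_rng(0)

X_DIM, Z_DIM = 784, 64  # e.g. a flattened 28x28 image and a 64-d latent code

# Toy linear maps standing in for the three networks (illustration only).
W_E = rng.normal(size=(X_DIM, Z_DIM)) * 0.01      # encoder E: x -> z
W_G = rng.normal(size=(Z_DIM, X_DIM)) * 0.01      # generator G: z -> x
W_D = rng.normal(size=(X_DIM + Z_DIM, 1)) * 0.01  # discriminator D: (x, z) -> score

def E(x):
    # Encoder: maps a real image x to its latent representation E(x)
    return x @ W_E

def G(z):
    # Generator: maps random noise z to a fake image G(z)
    return z @ W_G

def D(x, z):
    # Discriminator: scores the *pair* (x, z), never x alone
    pair = np.concatenate([x, z], axis=1)
    return 1.0 / (1.0 + np.exp(-(pair @ W_D)))  # sigmoid "real pair" probability

x = rng.normal(size=(8, X_DIM))  # a batch of 8 "real" images
z = rng.normal(size=(8, Z_DIM))  # a batch of 8 noise vectors

print(E(x).shape, G(z).shape, D(x, E(x)).shape)  # (8, 64) (8, 784) (8, 1)
```

Note that D takes two arguments here; in the classic GAN it would take only the image.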

According to the paper: "the encoder E maps data x to a latent representation z".

Combined with the structure diagram, the "latent representation z" should correspond to E(x), i.e. the mapping of the real data into the noise domain.

According to the paper, the codes given by the encoder E have some semantic characteristics, so E(x) can be regarded as a kind of label for the data x.

What does the discriminator D judge?

This question seems a little strange at first, because the classic GAN's discriminator judges "whether the input data is real data".

BiGAN is different: here the input becomes a data pair ([data, the data's mapping in the noise domain]).

Moreover, in BiGAN the real data is never fed into the discriminator D on its own; in other words, the discriminator D never sees real data by itself.

Everything the discriminator D learns about real data is "told" to it by the encoder E.

And what is the encoder E doing? It is learning to extract features of the real data. These features may be more abstract, higher-dimensional features; they may help characterize the original real data, or they may be useless. The encoder E cannot know this in advance; it can only extract features as well as it can (this is the process of training the encoder E).

Meanwhile our generator G also produces images, and those images likewise have corresponding "features". Since the fake images are generated from random noise, the random noise is the feature of those fake images. Therefore the features that the encoder E extracts from real images must fall into the same space as the random noise; only then can the discriminator make a judgment (compute similarity).

So now we can sum up:

The discriminator D in BiGAN "distinguishes" whether the current input pair carries the real-data features given by the encoder E, or the random noise used by the generator G to produce fake images.
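The summary above boils down to how the discriminator's inputs are assembled: a real pair (x, E(x)) labeled 1 and a fake pair (G(z), z) labeled 0. A minimal NumPy sketch, where the arrays stand in for the networks' outputs and all dimensions are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
batch, x_dim, z_dim = 4, 6, 3

x = rng.normal(size=(batch, x_dim))    # real images
z = rng.normal(size=(batch, z_dim))    # random noise

# Stand-ins for the networks' outputs (any encoder/generator would do here).
e_x = rng.normal(size=(batch, z_dim))  # E(x): latent codes of the real images
g_z = rng.normal(size=(batch, x_dim))  # G(z): fake images made from the noise

# D never sees an image or a code alone -- only joint (image, code) pairs.
real_pairs = np.concatenate([x, e_x], axis=1)   # (x,    E(x)) -> label 1
fake_pairs = np.concatenate([g_z, z], axis=1)   # (G(z), z)    -> label 0

inputs = np.vstack([real_pairs, fake_pairs])
labels = np.concatenate([np.ones(batch), np.zeros(batch)])

print(inputs.shape, labels.shape)  # (8, 9) (8,)
```

This is also why E(x) and z must live in the same space: otherwise the two kinds of pairs could not share one discriminator input.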

The purpose of the encoder E and the generator G

Having seen what the discriminator D does, let's look at what the encoder E and the generator G are doing.

By analogy with the classic GAN, we naturally conclude: the encoder E and the generator G try their best to fool the discriminator D.

To achieve this, the encoder E and the generator G need to approach each other's inverse. In the original paper this corresponds to:

we will both argue intuitively and formally prove that the encoder and generator must learn to invert one another in order to fool the BiGAN discriminator.

" Encoder E And generators G You need to learn to reverse each other "

A popular explanation is : For encoders E Come on , Although it really extracts features from real data , But in the process of continuous training , It also gradually learned about generators G Features of the generated picture , therefore Encoder E Will deliberately imitate the characteristics of the false picture and send it to the discriminator D, Test whether the discriminator can distinguish .

This is it. BiGAN The structure diagram of why the encoder E And generators G The reason why the input and output boxes are together , Because they are connected with each other ( It is possible to interchange )
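For completeness, the "fooling" objective can be written as the usual GAN cross-entropy applied to pairs: D pushes D(x, E(x)) toward 1 and D(G(z), z) toward 0, while E and G jointly push in the opposite direction. A minimal sketch; the function name and this loss form are my own phrasing of the paper's minimax objective:

```python
import numpy as np

def bigan_losses(d_real_pair, d_fake_pair, eps=1e-8):
    """Cross-entropy form of the BiGAN minimax game.

    d_real_pair: D's probability that the pair (x, E(x)) is "real"
    d_fake_pair: D's probability that the pair (G(z), z) is "real"
    """
    # D wants d_real_pair -> 1 and d_fake_pair -> 0
    d_loss = -np.mean(np.log(d_real_pair + eps) + np.log(1.0 - d_fake_pair + eps))
    # E and G share one adversarial objective: fool D on both pairs
    eg_loss = -np.mean(np.log(d_fake_pair + eps) + np.log(1.0 - d_real_pair + eps))
    return d_loss, eg_loss

# A confident discriminator has a low loss and leaves E/G with a high loss.
d_loss, eg_loss = bigan_losses(np.array([0.9]), np.array([0.1]))
print(round(d_loss, 3), round(eg_loss, 3))
```

Notice that E and G share a single loss, which matches the point above: they are trained toward being inverses of each other, not independently.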
