https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/3_NeuralNetworks/multilayer_perceptron.py
http://www.jessicayung.com/explaining-tensorflow-code-for-a-multilayer-perceptron/
http://www.juergenwiki.de/notes/deep_learning_tf_multi_layer_perceptron.html
http://students.washington.edu/adelak/2017/04/?p=350
TensorFlow Multilayer perceptron
- Antonio Linares
- Site Admin
- Posts: 42739
- Joined: Thu Oct 06, 2005 5:47 pm
- Location: Spain
- Has thanked: 105 times
- Been thanked: 108 times
- Contact:
Re: TensorFlow Multilayer perceptron
Classifying Text with Neural Networks and TensorFlow
https://medium.com/@Synced/big-picture-machine-learning-classifying-text-with-neural-networks-and-tensorflow-da3358625601
https://github.com/dmesquita/understanding_tensorflow_nn
First, create an index for each word. Then, create a matrix for each text, in which the value is 1 if the word occurs in the text and 0 otherwise.
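The index-then-matrix steps described above can be sketched in Python (an illustration of the bag-of-words idea, not the exact code from the linked repo):

```python
# Build a word index over a small corpus, then encode each text as a
# binary vector: 1 if the word occurs in the text, 0 otherwise.

texts = ["Hi from Brazil", "Hi from Spain", "TensorFlow from scratch"]

# 1) Assign every distinct (lower-cased) word an index.
vocab = {}
for text in texts:
    for word in text.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab)

# 2) One binary row per text.
matrix = []
for text in texts:
    row = [0] * len(vocab)
    for word in text.lower().split():
        row[vocab[word]] = 1
    matrix.append(row)

print(vocab)
print(matrix)
```

These rows are what actually get fed to the network's input layer.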
Re: TensorFlow Multilayer perceptron
Creating a bitmap from a text:
https://github.com/dmesquita/understanding_tensorflow_nn
Turned into Harbour code:
```harbour
#include "FiveWin.ch"

function Main()

   local hVocabulary := hb_Hash(), hWordToIndex
   local cText := "Hi from Brazil"
   local cWord, aMatrix

   // Count how many times each (lower-cased) word appears
   for each cWord in TextSplit( cText )
      if hb_HHasKey( hVocabulary, Lower( cWord ) )
         hVocabulary[ Lower( cWord ) ] += 1
      else
         hVocabulary[ Lower( cWord ) ] = 1
      endif
   next

   XBrowser( hVocabulary )
   XBrowser( hWordToIndex := WordToIndex( hVocabulary ) )

   // Build the per-word count vector for the text
   aMatrix = Array( Len( hVocabulary ) )
   AFill( aMatrix, 0 )
   for each cWord in TextSplit( cText )
      aMatrix[ hWordToIndex[ Lower( cWord ) ] ] += 1
   next

   XBrowser( aMatrix )

return nil

// Split a text into an array of word tokens
function TextSplit( cText )

   local n, aTokens := {}

   for n = 1 to NumToken( cText )
      AAdd( aTokens, Token( cText,, n ) )
   next

return aTokens

// Map each vocabulary word to its position (1-based)
function WordToIndex( hVocabulary )

   local hWordToIndex := hb_Hash()
   local n

   for n = 1 to Len( hVocabulary )
      hWordToIndex[ hb_HKeyAt( hVocabulary, n ) ] = n
   next

return hWordToIndex
```
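For comparison, the same tokenise/index/count sequence can be sketched in Python (an illustrative translation of the Harbour code above, not part of the linked repo; note Python dicts are 0-indexed where Harbour arrays are 1-based):

```python
# Mirror of the Harbour routine: count word frequencies, assign each
# word an index, then fill a vector with per-word counts.

text = "Hi from Brazil"
words = text.lower().split()

# Vocabulary with occurrence counts (like hVocabulary).
vocabulary = {}
for word in words:
    vocabulary[word] = vocabulary.get(word, 0) + 1

# Word -> position (like WordToIndex, but 0-based).
word_to_index = {word: i for i, word in enumerate(vocabulary)}

# Count vector for the text (like aMatrix).
matrix = [0] * len(vocabulary)
for word in words:
    matrix[word_to_index[word]] += 1

print(vocabulary)
print(matrix)
```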
Re: TensorFlow Multilayer perceptron
Meet the Robot Writing ‘Friends’ Sequels
After I read this:
http://www.thedailybeast.com/meet-the-robot-writing-friends-sequels
I got quite curious to understand how he did it:
“It works by predicting the next letter to follow a given sequence of letters, and the predictions are determined by what it learned about language from the Friends dialogue provided,”
This seems different from what I previously posted in this thread, but it seems clear that words (and text) must be turned into a bitmap so the neural network can process them.

Imagine a neural network learning from already existing code and writing the next code for you.

I would appreciate it if you shared your ideas about it.
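The mechanism described in the quote (predicting the next letter from the letters that precede it) starts from a simple data-preparation step that can be sketched like this; a generic illustration, not the actual model from the article:

```python
# Turn a text into (window, next_char) training pairs: the input a
# character-level language model learns from.

text = "how you doin"
window = 4  # how many previous characters the model sees

pairs = []
for i in range(len(text) - window):
    pairs.append((text[i:i + window], text[i + window]))

print(pairs[:3])
```

The network is then trained to map each window to its following character; generating text is just predicting a character, appending it, and sliding the window forward.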
