How to implement sklearn PolynomialFeatures in TensorFlow?

I am trying to implement scikit-learn's PolynomialFeatures as a layer in a feedforward neural network in TensorFlow and Keras. For simplicity, I will give an example using NumPy arrays. If a batch has three samples, and the activation of a certain layer is equal to the (3, 2)-shaped matrix

>>> import numpy as np
>>> X = np.arange(0, 6).reshape(3, 2)
>>> X
array([[0, 1],
       [2, 3],
       [4, 5]])

then I would like the activation in the next layer to be equal to the degree-2 polynomial feature expansion of X:

>>> from sklearn.preprocessing import PolynomialFeatures
>>> PolynomialFeatures(degree=2).fit_transform(X)
array([[  1.,   0.,   1.,   0.,   0.,   1.],
       [  1.,   2.,   3.,   4.,   6.,   9.],
       [  1.,   4.,   5.,  16.,  20.,  25.]])

That is, if the activation of layer i is the matrix X (of shape (batch_size, num_features)), then for degree=2 I would like the activation of layer i + 1 to be the concatenation of

  • a column of batch_size ones,
  • X itself,
  • all degree-2 products of the columns of X: X[:, 0] * X[:, 0], X[:, 0] * X[:, 1], X[:, 1] * X[:, 1] (a plain NumPy version of this target is sketched right after this list).
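
Just to spell the target out, here is that concatenation written by hand in NumPy for the (3, 2) example above. This is only my own illustration of the desired result, nothing sklearn-specific:

import numpy as np

X = np.arange(6).reshape(3, 2)
ones = np.ones((X.shape[0], 1))               # column of batch_size ones
degree2_terms = np.column_stack([
    X[:, 0] * X[:, 0],                        # x0 * x0
    X[:, 0] * X[:, 1],                        # x0 * x1
    X[:, 1] * X[:, 1],                        # x1 * x1
])
XP = np.column_stack([ones, X, degree2_terms])
# XP matches PolynomialFeatures(degree=2).fit_transform(X) above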

Here is what I get if I simply concatenate element-wise powers of X with the Keras backend:

import keras.backend as K
X = K.reshape(K.arange(0, 6), (3, 2))
with K.get_session().as_default():
    print(K.concatenate([K.pow(X, 0), K.pow(X, 1), K.pow(X, 2)]).eval())

The result:

[[ 1  1  0  1  0  1]
 [ 1  1  2  3  4  9]
 [ 1  1  4  5 16 25]]

That is, I end up with two columns of ones (one of them redundant), X itself, and the element-wise squares of X, but the cross-product column X[:, 0] * X[:, 1] is missing.

How can I do this (ideally efficiently)? Looking at the PolynomialFeatures source, as far as I can tell each output column is built (in NumPy, not TensorFlow) as a product along axis=1 of a selection of input columns: XP[:, i] = X[:, c].prod(axis=1), where c is a tuple of column indices such as (0, 0, 1).
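
To make the question concrete, below is a rough sketch of how I imagine this could look with plain TensorFlow ops. The function polynomial_features and its structure are my own guess, not anything from sklearn or tensorflow, and I don't know whether building one column per index combination like this is reasonable or efficient, which is essentially what I'm asking:

import itertools
import tensorflow as tf

def polynomial_features(X, degree=2):
    # X is a (batch_size, num_features) tensor with a statically known
    # number of features; the name and structure of this function are my
    # own, not sklearn's or TensorFlow's.
    num_features = int(X.shape[1])
    # The bias column (a column of ones) is the product over the empty
    # combination of columns.
    columns = [tf.ones_like(X[:, :1])]
    for d in range(1, degree + 1):
        # One output column per combination (with replacement) of input
        # column indices, e.g. (0, 0, 1) -> X[:, 0] * X[:, 0] * X[:, 1].
        for c in itertools.combinations_with_replacement(range(num_features), d):
            col = tf.reduce_prod(tf.gather(X, list(c), axis=1), axis=1, keepdims=True)
            columns.append(col)
    return tf.concat(columns, axis=1)

For the (3, 2) example above this seems to produce the same six columns as PolynomialFeatures(degree=2), in the same order, but I haven't verified it beyond that, and I'm not sure it is the right way to express this as a layer.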


Source: https://habr.com/ru/post/1016424/

