Following the suggestions and comments of Philip Malchak and Sheani123, I implemented a neural network in TensorFlow to see what happens when we try to teach it to predict (and interpolate) the function x^2.
Continuous Interval Training
I trained the network on the interval [-7, 7] (sampling 300 points inside this interval to approximate a continuous input), and then tested it on the interval [-30, 30]. The activation functions are ReLU, the network has 3 hidden layers, each of size 50, and epochs = 500. The result is shown in the figure below.
So, basically, inside (and also close to) the interval [-7, 7] the fit is essentially perfect, and outside it the output continues more or less linearly. It is nice to see that, at least initially, the slope of the network output tries to match the slope of x^2. If we widen the test interval, the two graphs diverge sharply, as can be seen in the figure below:
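The roughly linear continuation outside the training interval is exactly what a ReLU network must produce: such a network computes a piecewise-linear function, so beyond its outermost "kink" the output is affine in x. Here is a small NumPy illustration of that property with random weights (not the trained network from this post, just a sketch):

```python
import numpy as np

# A one-hidden-layer ReLU network is piecewise linear; each hidden unit
# contributes a "kink" at x = -b/w.  Weights are chosen here so that all
# kinks lie in [-2, 2]; the response must then be exactly affine beyond that.
rng = np.random.default_rng(0)
w1 = rng.uniform(0.5, 1.5, size=50) * rng.choice([-1.0, 1.0], size=50)
b1 = rng.uniform(-1.0, 1.0, size=50)     # kinks at -b1/w1, all in [-2, 2]
w2 = rng.normal(size=50)
b2 = rng.normal()

def net(x):
    h = np.maximum(0.0, np.outer(x, w1) + b1)   # hidden ReLU layer
    return h @ w2 + b2                          # linear output

x = np.linspace(10.0, 20.0, 11)                 # well past every kink
slopes = np.diff(net(x)) / np.diff(x)
print(np.allclose(slopes, slopes[0]))           # → True: constant slope out here
```

Since x^2 keeps curving while the network is stuck on one affine piece, divergence far outside the training interval is unavoidable, whatever the training setup.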

Even number training
Finally, if instead I train the network on the set of all even numbers in the interval [-100, 100] and evaluate it on the set of all integers (even and odd) in this interval, I get:
When training the network to produce the image above, I increased the number of epochs to 2500 to get better accuracy; the other parameters remained unchanged. So it seems that interpolation "inside" the training interval works pretty well (except perhaps for the region around 0, where the fit is slightly worse).
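For concreteness, the train/test split described above can be built as follows (a sketch; this part of the code is not shown in the post):

```python
import numpy as np

# Even integers in [-100, 100] for training; all integers for testing.
x_train = np.arange(-100, 101, 2, dtype=np.float32)   # 101 even values
y_train = x_train ** 2
x_test = np.arange(-100, 101, dtype=np.float32)       # 201 integers
y_test = x_test ** 2
print(x_train.size, x_test.size)                      # → 101 201
```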
Here is the code I used for the first figure:
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np
from tensorflow.python.framework.ops import reset_default_graph
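The snippet above only shows the imports (the rest of the script is not included in this excerpt). As a rough, framework-free stand-in for the first experiment — 300 points on [-7, 7], three hidden ReLU layers of width 50, mean-squared-error loss — here is a plain-NumPy gradient-descent sketch; the learning rate and initialization are my own choices, not taken from the original code:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-7.0, 7.0, 300).reshape(-1, 1)   # 300 training points
y = x ** 2

# He-style init: three hidden ReLU layers of width 50, linear output.
sizes = [1, 50, 50, 50, 1]
Ws = [rng.normal(0, np.sqrt(2 / m), size=(m, n)) for m, n in zip(sizes, sizes[1:])]
bs = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    # Returns the list of layer activations, input first, output last.
    acts = [x]
    for W, b in zip(Ws[:-1], bs[:-1]):
        acts.append(np.maximum(0.0, acts[-1] @ W + b))   # ReLU layers
    acts.append(acts[-1] @ Ws[-1] + bs[-1])              # linear output
    return acts

lr = 1e-4
loss0 = np.mean((forward(x)[-1] - y) ** 2)
for epoch in range(500):
    acts = forward(x)
    grad = 2 * (acts[-1] - y) / len(x)                   # d(MSE)/d(output)
    for i in reversed(range(len(Ws))):                   # backprop
        gW = acts[i].T @ grad
        gb = grad.sum(axis=0)
        grad = grad @ Ws[i].T
        if i > 0:
            grad *= (acts[i] > 0)                        # ReLU derivative
        Ws[i] -= lr * gW
        bs[i] -= lr * gb

loss = np.mean((forward(x)[-1] - y) ** 2)
print(loss0, "->", loss)                                 # training loss drops
```

The original TensorFlow script does the same work with a computation-graph optimizer instead of this hand-written backprop loop.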