Training model...
Step Train loss Test loss Test metric
0 [2.47e+01, 6.13e-01, 4.66e-02, 2.29e+00, 1.27e+00, 4.02e+00] [1.51e+01, 6.13e-01, 4.66e-02, 2.29e+00, 1.27e+00, 4.02e+00] [4.01e+00, 7.23e-01]
200 [1.41e+00, 4.07e-02, 3.24e-02, 1.20e-01, 1.07e-02, 4.25e-01] [7.80e-01, 4.07e-02, 3.24e-02, 1.20e-01, 1.07e-02, 4.25e-01] [2.71e-01, 1.88e-01]
400 [4.45e-02, 3.35e-03, 3.68e-03, 4.82e-03, 5.70e-03, 6.48e-03] [4.22e-02, 3.35e-03, 3.68e-03, 4.82e-03, 5.70e-03, 6.48e-03] [1.52e-02, 4.45e-02]
600 [1.63e-02, 2.23e-03, 1.35e-03, 2.39e-03, 1.92e-03, 3.83e-03] [1.67e-02, 2.23e-03, 1.35e-03, 2.39e-03, 1.92e-03, 3.83e-03] [1.23e-02, 4.00e-02]
800 [1.00e-02, 1.60e-03, 6.82e-04, 1.55e-03, 8.09e-04, 2.26e-03] [9.13e-03, 1.60e-03, 6.82e-04, 1.55e-03, 8.09e-04, 2.26e-03] [1.11e-02, 3.81e-02]
1000 [7.20e-03, 1.17e-03, 4.14e-04, 1.12e-03, 4.42e-04, 1.34e-03] [5.96e-03, 1.17e-03, 4.14e-04, 1.12e-03, 4.42e-04, 1.34e-03] [1.06e-02, 3.72e-02]
1200 [5.76e-03, 8.92e-04, 2.74e-04, 8.05e-04, 3.09e-04, 8.54e-04] [4.42e-03, 8.92e-04, 2.74e-04, 8.05e-04, 3.09e-04, 8.54e-04] [1.03e-02, 3.67e-02]
1400 [1.00e-02, 6.79e-04, 2.20e-04, 6.86e-04, 3.24e-04, 8.32e-04] [5.89e-03, 6.79e-04, 2.20e-04, 6.86e-04, 3.24e-04, 8.32e-04] [1.10e-02, 3.79e-02]
1600 [3.90e-03, 5.66e-04, 1.59e-04, 5.20e-04, 2.35e-04, 4.24e-04] [3.35e-03, 5.66e-04, 1.59e-04, 5.20e-04, 2.35e-04, 4.24e-04] [1.03e-02, 3.66e-02]
1800 [2.92e-03, 4.59e-04, 1.31e-04, 3.44e-04, 2.13e-04, 3.30e-04] [2.67e-03, 4.59e-04, 1.31e-04, 3.44e-04, 2.13e-04, 3.30e-04] [1.02e-02, 3.65e-02]
2000 [2.43e-03, 3.85e-04, 1.13e-04, 2.61e-04, 1.99e-04, 2.60e-04] [2.38e-03, 3.85e-04, 1.13e-04, 2.61e-04, 1.99e-04, 2.60e-04] [1.02e-02, 3.64e-02]
2200 [2.10e-03, 3.26e-04, 1.05e-04, 1.96e-04, 1.85e-04, 2.16e-04] [2.04e-03, 3.26e-04, 1.05e-04, 1.96e-04, 1.85e-04, 2.16e-04] [1.02e-02, 3.65e-02]
2400 [1.47e-02, 2.77e-04, 9.92e-05, 8.40e-04, 1.85e-04, 5.94e-04] [5.17e-03, 2.77e-04, 9.92e-05, 8.40e-04, 1.85e-04, 5.94e-04] [1.02e-02, 3.64e-02]
2600 [1.56e-03, 2.54e-04, 9.24e-05, 1.35e-04, 1.66e-04, 1.51e-04] [1.60e-03, 2.54e-04, 9.24e-05, 1.35e-04, 1.66e-04, 1.51e-04] [1.02e-02, 3.64e-02]
2800 [1.49e-03, 2.28e-04, 9.11e-05, 1.18e-04, 1.58e-04, 1.33e-04] [1.44e-03, 2.28e-04, 9.11e-05, 1.18e-04, 1.58e-04, 1.33e-04] [1.03e-02, 3.67e-02]
3000 [1.24e-03, 2.09e-04, 8.64e-05, 1.02e-04, 1.50e-04, 1.16e-04] [1.28e-03, 2.09e-04, 8.64e-05, 1.02e-04, 1.50e-04, 1.16e-04] [1.02e-02, 3.64e-02]
3200 [1.25e-03, 1.91e-04, 8.42e-05, 9.19e-05, 1.42e-04, 1.11e-04] [1.11e-03, 1.91e-04, 8.42e-05, 9.19e-05, 1.42e-04, 1.11e-04] [1.02e-02, 3.64e-02]
3400 [1.38e-03, 1.80e-04, 8.03e-05, 9.97e-05, 1.39e-04, 1.14e-04] [1.27e-03, 1.80e-04, 8.03e-05, 9.97e-05, 1.39e-04, 1.14e-04] [1.01e-02, 3.62e-02]
3600 [9.33e-04, 1.66e-04, 7.85e-05, 7.66e-05, 1.30e-04, 8.64e-05] [9.51e-04, 1.66e-04, 7.85e-05, 7.66e-05, 1.30e-04, 8.64e-05] [1.03e-02, 3.66e-02]
3800 [8.96e-04, 1.56e-04, 7.63e-05, 7.26e-05, 1.22e-04, 8.32e-05] [9.11e-04, 1.56e-04, 7.63e-05, 7.26e-05, 1.22e-04, 8.32e-05] [1.02e-02, 3.65e-02]
4000 [8.08e-04, 1.47e-04, 7.39e-05, 6.76e-05, 1.17e-04, 7.53e-05] [8.14e-04, 1.47e-04, 7.39e-05, 6.76e-05, 1.17e-04, 7.53e-05] [1.03e-02, 3.67e-02]
4200 [7.85e-04, 1.39e-04, 7.15e-05, 6.59e-05, 1.11e-04, 7.08e-05] [7.96e-04, 1.39e-04, 7.15e-05, 6.59e-05, 1.11e-04, 7.08e-05] [1.04e-02, 3.67e-02]
4400 [7.14e-04, 1.32e-04, 6.92e-05, 6.18e-05, 1.04e-04, 6.85e-05] [6.93e-04, 1.32e-04, 6.92e-05, 6.18e-05, 1.04e-04, 6.85e-05] [1.03e-02, 3.67e-02]
4600 [6.74e-04, 1.26e-04, 6.67e-05, 5.94e-05, 9.90e-05, 6.56e-05] [6.59e-04, 1.26e-04, 6.67e-05, 5.94e-05, 9.90e-05, 6.56e-05] [1.03e-02, 3.67e-02]
4800 [6.49e-04, 1.21e-04, 6.48e-05, 5.85e-05, 9.22e-05, 6.34e-05] [6.32e-04, 1.21e-04, 6.48e-05, 5.85e-05, 9.22e-05, 6.34e-05] [1.03e-02, 3.67e-02]
5000 [8.01e-03, 1.16e-04, 6.24e-05, 4.47e-04, 1.05e-04, 2.51e-04] [2.54e-03, 1.16e-04, 6.24e-05, 4.47e-04, 1.05e-04, 2.51e-04] [1.01e-02, 3.63e-02]
5200 [7.13e-04, 1.11e-04, 6.06e-05, 5.79e-05, 8.24e-05, 6.51e-05] [5.39e-04, 1.11e-04, 6.06e-05, 5.79e-05, 8.24e-05, 6.51e-05] [1.03e-02, 3.67e-02]
5400 [8.33e-03, 1.09e-04, 5.82e-05, 5.97e-04, 8.29e-05, 2.90e-04] [3.31e-03, 1.09e-04, 5.82e-05, 5.97e-04, 8.29e-05, 2.90e-04] [1.07e-02, 3.74e-02]
5600 [5.26e-04, 1.04e-04, 5.62e-05, 5.21e-05, 7.47e-05, 5.50e-05] [4.86e-04, 1.04e-04, 5.62e-05, 5.21e-05, 7.47e-05, 5.50e-05] [1.04e-02, 3.68e-02]
5800 [5.26e-04, 9.98e-05, 5.51e-05, 5.41e-05, 7.05e-05, 5.28e-05] [4.47e-04, 9.98e-05, 5.51e-05, 5.41e-05, 7.05e-05, 5.28e-05] [1.04e-02, 3.69e-02]
6000 [1.71e-03, 9.98e-05, 5.20e-05, 8.77e-05, 8.02e-05, 1.29e-04] [1.06e-03, 9.98e-05, 5.20e-05, 8.77e-05, 8.02e-05, 1.29e-04] [9.97e-03, 3.61e-02]
6200 [4.62e-04, 9.37e-05, 5.14e-05, 4.85e-05, 6.32e-05, 5.06e-05] [4.23e-04, 9.37e-05, 5.14e-05, 4.85e-05, 6.32e-05, 5.06e-05] [1.04e-02, 3.68e-02]
6400 [1.24e-02, 1.04e-04, 6.88e-05, 4.85e-04, 1.62e-04, 5.38e-04] [4.89e-03, 1.04e-04, 6.88e-05, 4.85e-04, 1.62e-04, 5.38e-04] [1.23e-02, 4.01e-02]
6600 [4.29e-04, 8.78e-05, 4.89e-05, 4.62e-05, 5.66e-05, 4.85e-05] [3.87e-04, 8.78e-05, 4.89e-05, 4.62e-05, 5.66e-05, 4.85e-05] [1.04e-02, 3.68e-02]
6800 [4.44e-04, 8.54e-05, 4.80e-05, 5.00e-05, 5.34e-05, 4.67e-05] [3.52e-04, 8.54e-05, 4.80e-05, 5.00e-05, 5.34e-05, 4.67e-05] [1.04e-02, 3.69e-02]
7000 [4.45e-04, 8.20e-05, 4.68e-05, 4.90e-05, 5.06e-05, 4.75e-05] [3.42e-04, 8.20e-05, 4.68e-05, 4.90e-05, 5.06e-05, 4.75e-05] [1.04e-02, 3.68e-02]
7200 [1.10e-03, 8.01e-05, 4.55e-05, 8.02e-05, 4.90e-05, 6.38e-05] [6.57e-04, 8.01e-05, 4.55e-05, 8.02e-05, 4.90e-05, 6.38e-05] [1.06e-02, 3.71e-02]
7400 [3.90e-04, 7.74e-05, 4.49e-05, 4.40e-05, 4.53e-05, 4.37e-05] [3.22e-04, 7.74e-05, 4.49e-05, 4.40e-05, 4.53e-05, 4.37e-05] [1.04e-02, 3.68e-02]
7600 [9.62e-04, 7.53e-05, 4.44e-05, 8.08e-05, 4.29e-05, 5.68e-05] [4.38e-04, 7.53e-05, 4.44e-05, 8.08e-05, 4.29e-05, 5.68e-05] [1.03e-02, 3.66e-02]
7800 [4.29e-04, 7.39e-05, 4.34e-05, 4.04e-05, 4.16e-05, 3.93e-05] [3.31e-04, 7.39e-05, 4.34e-05, 4.04e-05, 4.16e-05, 3.93e-05] [1.05e-02, 3.70e-02]
8000 [4.22e-04, 7.15e-05, 4.22e-05, 4.16e-05, 3.97e-05, 4.45e-05] [3.48e-04, 7.15e-05, 4.22e-05, 4.16e-05, 3.97e-05, 4.45e-05] [1.05e-02, 3.70e-02]
8200 [3.28e-04, 6.97e-05, 4.14e-05, 3.75e-05, 3.80e-05, 4.21e-05] [2.86e-04, 6.97e-05, 4.14e-05, 3.75e-05, 3.80e-05, 4.21e-05] [1.04e-02, 3.69e-02]
8400 [3.18e-04, 6.78e-05, 4.07e-05, 3.66e-05, 3.62e-05, 4.11e-05] [2.75e-04, 6.78e-05, 4.07e-05, 3.66e-05, 3.62e-05, 4.11e-05] [1.04e-02, 3.69e-02]
8600 [3.30e-03, 6.64e-05, 3.93e-05, 2.54e-04, 3.58e-05, 1.37e-04] [1.34e-03, 6.64e-05, 3.93e-05, 2.54e-04, 3.58e-05, 1.37e-04] [1.07e-02, 3.73e-02]
8800 [3.50e-04, 6.45e-05, 3.97e-05, 3.51e-05, 3.28e-05, 4.25e-05] [2.78e-04, 6.45e-05, 3.97e-05, 3.51e-05, 3.28e-05, 4.25e-05] [1.04e-02, 3.68e-02]
9000 [3.08e-04, 6.27e-05, 3.88e-05, 3.51e-05, 3.11e-05, 3.93e-05] [2.48e-04, 6.27e-05, 3.88e-05, 3.51e-05, 3.11e-05, 3.93e-05] [1.04e-02, 3.69e-02]
9200 [2.91e-04, 6.13e-05, 3.81e-05, 3.25e-05, 3.01e-05, 3.88e-05] [2.51e-04, 6.13e-05, 3.81e-05, 3.25e-05, 3.01e-05, 3.88e-05] [1.05e-02, 3.70e-02]
9400 [5.16e-04, 5.99e-05, 3.81e-05, 3.75e-05, 3.09e-05, 3.43e-05] [3.00e-04, 5.99e-05, 3.81e-05, 3.75e-05, 3.09e-05, 3.43e-05] [1.06e-02, 3.72e-02]
9600 [2.68e-04, 5.85e-05, 3.69e-05, 3.07e-05, 2.79e-05, 3.72e-05] [2.30e-04, 5.85e-05, 3.69e-05, 3.07e-05, 2.79e-05, 3.72e-05] [1.05e-02, 3.70e-02]
9800 [2.81e-04, 5.66e-05, 3.64e-05, 3.01e-05, 2.66e-05, 3.80e-05] [2.29e-04, 5.66e-05, 3.64e-05, 3.01e-05, 2.66e-05, 3.80e-05] [1.05e-02, 3.70e-02]
10000 [2.59e-04, 5.57e-05, 3.58e-05, 2.99e-05, 2.55e-05, 3.60e-05] [2.13e-04, 5.57e-05, 3.58e-05, 2.99e-05, 2.55e-05, 3.60e-05] [1.05e-02, 3.70e-02]
10200 [3.35e-04, 5.45e-05, 3.52e-05, 3.16e-05, 2.46e-05, 3.92e-05] [2.52e-04, 5.45e-05, 3.52e-05, 3.16e-05, 2.46e-05, 3.92e-05] [1.05e-02, 3.70e-02]
Best model at step 10000:
train loss: 4.42e-04
test loss: 3.96e-04
test metric: [1.05e-02, 3.70e-02]
'train' took 170.610247 s
Hello. I apologize if there has already been a similar topic, but I have not found one among these issues.
I have been experimenting with a simple time-dependent equation on a rectangular 2D domain, and I have found some strange behavior of the BC losses. Or maybe I am doing something wrong.
The image of the PDE system
Below is the code, divided into blocks:
- Imports and random seed setting
- Domain limits
- PDE, BC, IC, derivatives and exact solution functions
- `on_boundary` checking functions with corner-point filters
- Geometry, boundary and initial conditions
- `TimePDE` object, neural network and model wrapper
- Compile and run
When `PDEPointResampler` is disabled, everything's OK: the test losses and train losses have the same scale, and the metrics seem to be alright (for `LR=1e-3`).

Training log when `PDEPointResampler` is disabled

But when `PDEPointResampler` is enabled, some of the test losses grow very much; here it is especially the 4th boundary condition, the Neumann condition for $y=+\pi$. But the metrics are still the same.

Training log when `PDEPointResampler` is enabled

I have tried to increase or decrease the number of training points and to change the resampling frequency, but the result is the same: even when I get convergence according to the training loss and metrics, I still get bad test losses.

I have compared the predicted and true derivative at $y=+\pi$ on a regular grid, and they look almost the same.
The code
The output
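The derivative comparison above can be sketched roughly as follows. Everything here is a hypothetical placeholder: in the notebook, `predicted_dy` would come from the trained model's gradients and `exact_dy` from the analytic solution; a small artificial error term stands in for the model's imperfection.

```python
import numpy as np

# Hypothetical stand-ins for the model's du/dy and the exact du/dy
# on the boundary y = +pi (these are NOT the notebook's actual functions).
def predicted_dy(x, t):
    return np.cos(x) * np.exp(-t) + 1e-3 * np.sin(5 * x)  # small fake error

def exact_dy(x, t):
    return np.cos(x) * np.exp(-t)

# Regular grid on the boundary: x in [-pi, pi], t in [0, 1]
x = np.linspace(-np.pi, np.pi, 101)
t = np.linspace(0.0, 1.0, 51)
X, T = np.meshgrid(x, t)

diff = np.abs(predicted_dy(X, T) - exact_dy(X, T))
print(f"max  |pred - exact| = {diff.max():.2e}")
print(f"mean |pred - exact| = {diff.mean():.2e}")
```

A small pointwise difference like this on a regular grid is consistent with the good test metrics, which makes the large test BC loss terms all the more surprising.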
I have also calculated this loss term for points from the training and the testing sets of, as far as I understand, the last model state. I have tried to take the points from two different places. The losses are low.
The code and the output
The code:
The output:
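A minimal sketch of recomputing one Neumann BC loss term by hand, assuming DeepXDE's default mean-squared-error loss. The arrays are hypothetical placeholders: `dudy_pred` would come from the model's gradients at the boundary points and `dudy_true` from the exact solution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder boundary data: "true" normal derivatives plus a small residual
n_bc = 64
dudy_true = rng.normal(size=(n_bc, 1))
dudy_pred = dudy_true + 1e-3 * rng.normal(size=(n_bc, 1))

# DeepXDE's default loss is the MSE of the BC residual
residual = dudy_pred - dudy_true
loss = np.mean(residual**2)
print(f"Neumann BC loss term: {loss:.2e}")
```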
Another way to calculate the loss: with `model._outputs_losses()`. The result is the same as in the last entry of the training log.

The code and the output

The code:

The output:
I was analysing this behavior with Gemini, DeepSeek and Qwen Code, and the latter told me to check the BC error calculation cycle in `PDE.losses()` in `deepxde/data/pde.py`. Indeed, it seems that for the calculation of test losses, train points are sent to `bc.error()` along with the actual inputs and outputs. There are `beg` and `end` pointers which must select the correct span from the train points array, but probably after resampling something breaks?

I have created a copy of the `NeumannBC` class and added some debug output there:
- `X` and `inputs`; values of `beg` and `end`
- `X` and `inputs` after `[beg : end]`
- which `X` and `inputs` elements were chosen (i.e. filtered)
- `X` and `inputs` (filtered) and the difference between `func(X_filtered)` and `func(inputs_filtered)`

As far as I understand, the filtered `inputs` and the filtered `X` must be the same, as they are used to calculate the derivative values at the boundary points and the normal vectors at these points. But the difference is non-zero.

The code
The output
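For illustration, here is a simplified model of the `beg`/`end` slicing I believe happens in `PDE.losses()`. The layout and names are assumptions, not DeepXDE's actual internals: the points array is taken to be laid out as consecutive BC blocks, with `(beg, end)` pointers selecting each block, and the "error" just measures the mismatch between the array passed as `X` and the array the outputs were evaluated on.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical layout: two BC blocks of 10 points each at the front
num_bcs = [10, 10]
train_x = rng.uniform(-np.pi, np.pi, size=(30, 2))  # after resampling
test_x = rng.uniform(-np.pi, np.pi, size=(30, 2))   # fixed test points

def bc_error(X, inputs, beg, end):
    # Normals/filters come from X, values from `inputs`; if the two are
    # different arrays, the point pairing is inconsistent.
    return np.abs(X[beg:end] - inputs[beg:end]).max()

beg = 0
for n in num_bcs:
    end = beg + n
    same = bc_error(train_x, train_x, beg, end)   # training-loss case
    mixed = bc_error(train_x, test_x, beg, end)   # test-loss case
    print(f"block [{beg}:{end}]  train/train diff = {same:.1e}  "
          f"train/test diff = {mixed:.1e}")
    beg = end
```

In this toy model the train/train difference is exactly zero, while mixing independently sampled arrays gives a non-zero difference, which matches the non-zero `func(X_filtered) - func(inputs_filtered)` difference in my debug output.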
If I set `SEND_AS_X` and `SEND_AS_INPUTS` to be the same (either `data.test_x` or `data.train_x`), or if I disable the `PDEPointResampler`, then the difference goes away.

The output
Am I doing anything wrong, or could there be a bug inside DeepXDE's resampling or loss calculation logic?
The whole Jupyter Notebook