Informatica


Novel Loss Function Construction for Neural Network-Based Prediction of Complex High-Dimensional Nonlinear Dynamical Systems
Kun An, Ying Sun, Minghui Yao, Junhua Zhang

https://doi.org/10.15388/25-INFOR607
Pub. online: 28 October 2025 · Type: Research Article · Open Access

Received: 1 May 2025
Accepted: 1 October 2025
Published: 28 October 2025

Abstract

Traditional loss functions such as mean squared error (MSE) are widely employed, but they often struggle to capture the dynamic characteristics of high-dimensional nonlinear systems. To address this issue, we propose an improved loss function that integrates linear multistep methods, system-consistency constraints, and prediction-phase error control. This construction simultaneously improves training accuracy and long-term stability. Furthermore, the introduction of recursive loss and interpolation strategies brings the model closer to practical prediction scenarios, broadening its applicability. Numerical simulations demonstrate that the proposed construction significantly outperforms both MSE and existing custom loss functions.
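To illustrate the kind of loss composition described in the abstract, the following sketch combines a one-step data-fitting MSE term, a two-step Adams-Bashforth (linear multistep) consistency residual, and a recursive multi-step prediction error. It is a minimal, hypothetical PyTorch example, not the authors' implementation: the function composite_loss, the weights w_data, w_ab2, w_rec, the rollout horizon, and the specific choice of the two-step Adams-Bashforth scheme are all illustrative assumptions.

# Hypothetical sketch of a composite loss for a learned vector field dx/dt = f(x);
# this is not the authors' implementation, only an illustration of combining a
# data-fitting term, a linear multistep consistency term, and a recursive
# prediction-phase term.
import torch
import torch.nn as nn

def composite_loss(f, x, dt, w_data=1.0, w_ab2=0.1, w_rec=0.1, horizon=5):
    """x: observed trajectory of shape (T, d), sampled with constant step dt."""
    mse = nn.MSELoss()

    # (i) one-step data term: explicit-Euler prediction vs. observed next state
    x_next_pred = x[:-1] + dt * f(x[:-1])
    loss_data = mse(x_next_pred, x[1:])

    # (ii) two-step Adams-Bashforth consistency:
    #      x_{n+1} ~ x_n + dt * (3/2 f(x_n) - 1/2 f(x_{n-1}))
    ab2 = x[1:-1] + dt * (1.5 * f(x[1:-1]) - 0.5 * f(x[:-2]))
    loss_ab2 = mse(ab2, x[2:])

    # (iii) recursive prediction-phase error: roll the model forward `horizon`
    #       steps from x_0 and compare against the observed trajectory
    x_roll, loss_rec = x[0], 0.0
    for k in range(1, min(horizon, x.shape[0])):
        x_roll = x_roll + dt * f(x_roll)
        loss_rec = loss_rec + mse(x_roll, x[k])

    return w_data * loss_data + w_ab2 * loss_ab2 + w_rec * loss_rec

# usage sketch
if __name__ == "__main__":
    d = 3
    net = nn.Sequential(nn.Linear(d, 64), nn.Tanh(), nn.Linear(64, d))
    traj = torch.randn(50, d)        # placeholder trajectory data
    loss = composite_loss(net, traj, dt=0.01)
    loss.backward()                  # gradients flow to the network parameters

In a training loop, one would evaluate such a loss on observed trajectory segments and backpropagate through the network, with the relative weights tuned to balance short-term fit against long-horizon stability.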


Biographies

An Kun
ankun1906@163.com

K. An is a postgraduate student in the School of Applied Science at Beijing Information Science and Technology University, Beijing, China. He received a BSc degree in Applied Statistics from Xuzhou University of Technology. His research focuses on data-driven modelling and deep learning.

Sun Ying
https://orcid.org/0000-0001-5611-2800
sunying0000@126.com

Y. Sun received her PhD from Beijing University of Technology. She is currently an associate professor at the School of Applied Science, Beijing Information Science and Technology University. Her research focuses on data-driven modelling and the analysis of nonlinear dynamical systems, including chaos and bifurcation theory.

Yao Minghui
merry_mingming@163.com

M. Yao is currently a professor at the School of Aeronautics and Astronautics, Tiangong University, China. Her research interests mainly focus on nonlinear vibrations of mechanical structures, energy harvesting, self-powered sensors and systems, and artificial intelligence.

Zhang Junhua
hua@bistu.edu.cn

J. Zhang received the PhD degree in engineering mechanics from Beijing University of Technology in 2009 and the MSc degree in applied mathematics from Hebei University of Technology in 2005. She is currently a professor with the College of Mechanical and Electrical Engineering, Beijing Information Science and Technology University, China. Her research interests include the dynamics and control of nonlinear mechanical systems and the design of lightweight metamaterial structures.



Copyright
© 2025 Vilnius University
Open access article under the CC BY license.

Keywords
custom loss function, neural networks, nonlinear dynamical systems, time series prediction

