
#1 MetaSymNet: A Tree-like Symbol Network with Adaptive Architecture and Activation Functions

Authors: Yanjie Li, Weijun Li, Lina Yu, Min Wu, Jingyi Liu, Shu Wei, Yusong Deng, Meilan Hao

Mathematical formulas are the language of communication between humans and nature. Discovering latent formulas from observed data is an important challenge in artificial intelligence, commonly known as symbolic regression (SR). Mainstream SR algorithms treat SR as a combinatorial optimization problem and solve it with Genetic Programming (GP) or Reinforcement Learning (RL). These methods perform well on simple problems but poorly on more complex tasks. Moreover, this class of algorithms overlooks an important property: in SR tasks, symbols have explicit numerical meaning. This raises a question: can we take full advantage of this property and solve the SR problem with more efficient numerical optimization methods? The Extrapolation and Learning Equation (EQL) approach replaces the activation functions in a neural network with basic symbols and sparsifies the connections to distill a simplified expression from a large network. However, EQL's fixed network structure cannot adapt to the complexity of different tasks, often resulting in redundancy or insufficient capacity, which limits its effectiveness. Based on this analysis, we propose MetaSymNet, a tree-like network that employs the PANGU meta-function as its activation function. The PANGU meta-function can evolve into various candidate functions during training, and the network structure can be adaptively adjusted to different tasks. The symbolic network then evolves into a concise, interpretable mathematical expression. To evaluate the performance of MetaSymNet and five baseline algorithms, we conducted experiments on more than ten datasets, including SRBench. The experimental results show that MetaSymNet achieves strong results across various evaluation metrics.
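The abstract does not specify how the PANGU meta-function is parameterized; a common way to realize an activation that "evolves into" one of several candidate symbols is a learnable, softmax-weighted mixture of primitive functions that sharpens toward a single primitive during training. The sketch below illustrates that idea; the candidate set, temperature, and all names (`MetaActivation`, `dominant_symbol`) are assumptions for illustration, not the authors' implementation.

```python
# A minimal sketch of a softmax-mixture "meta-activation", assuming the
# meta-function selects among a fixed set of candidate primitives.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MetaActivation(nn.Module):
    def __init__(self, temperature: float = 1.0):
        super().__init__()
        # Candidate primitives the node can converge to (assumed set).
        self.primitives = [
            ("id",  lambda x: x),
            ("sin", torch.sin),
            ("cos", torch.cos),
            ("exp", lambda x: torch.exp(torch.clamp(x, max=10.0))),
            ("sq",  lambda x: x * x),
        ]
        # One learnable logit per candidate primitive.
        self.logits = nn.Parameter(torch.zeros(len(self.primitives)))
        self.temperature = temperature

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Soft selection: weighted sum of all candidate outputs.
        w = F.softmax(self.logits / self.temperature, dim=0)
        outs = torch.stack([f(x) for _, f in self.primitives], dim=0)
        return (w.view(-1, *([1] * x.dim())) * outs).sum(dim=0)

    def dominant_symbol(self) -> str:
        # After training, read off the most probable primitive to
        # extract a discrete symbol for the final expression.
        return self.primitives[int(self.logits.argmax())][0]
```

Under this reading, each tree node carries such an activation, the whole network is fit with ordinary gradient-based optimization, and the discrete expression is recovered by taking `dominant_symbol()` at every node once the mixture weights have sharpened.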

Subject: AAAI.2025 - Search and Optimization