Smish Activation Function with New Updating Rule in Logic Satisfiability

Keywords: Non-systematic logic, Activation function, Discrete Hopfield Neural Network

Abstract

A deeper understanding of the additional mechanisms in the Discrete Hopfield Neural Network (DHNN) is essential for advancing its application in intelligent systems. This study investigates the performance of Conditional Random 2 Satisfiability logic (CRAN2SAT) in DHNN (DHNN-CRAN2SAT) in retrieving diverse and optimal final neuron states. The findings show that the Election Algorithm outperforms Exhaustive Search by consistently retrieving global minimum solutions. Additionally, the incorporation of a new updating rule during the retrieval phase enhances the diversity of the retrieved states, as indicated by lower Sokal–Sneath similarity indices and increased neuron state variation. These results highlight the significance of both the learning algorithm and the updating strategy in the retrieval phase of DHNN. By enabling a broader range of final neuron states, this approach offers meaningful improvements for logic mining models, particularly in addressing real-world classification challenges.
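As a rough illustration of two quantities named in the abstract, the sketch below implements the Smish activation, f(x) = x·tanh(ln(1 + sigmoid(x))), and one common form of the Sokal–Sneath similarity index for bipolar neuron states, SS = a / (a + 2(b + c)). This is a minimal sketch under stated assumptions: the function names are ours, the index variant shown is the one typically used in DHNN studies, and the paper's actual updating rule is not reproduced here.

```python
import math

def smish(x: float) -> float:
    """Smish activation: x * tanh(ln(1 + sigmoid(x)))."""
    sigmoid = 1.0 / (1.0 + math.exp(-x))
    return x * math.tanh(math.log(1.0 + sigmoid))

def sokal_sneath(benchmark: list[int], retrieved: list[int]) -> float:
    """Sokal-Sneath similarity between two bipolar (+1/-1) neuron states.

    a: positions where both states are +1
    b: benchmark +1, retrieved -1
    c: benchmark -1, retrieved +1
    Lower values indicate more diverse retrieved states.
    """
    a = sum(1 for p, q in zip(benchmark, retrieved) if p == 1 and q == 1)
    b = sum(1 for p, q in zip(benchmark, retrieved) if p == 1 and q == -1)
    c = sum(1 for p, q in zip(benchmark, retrieved) if p == -1 and q == 1)
    denom = a + 2 * (b + c)
    return a / denom if denom else 0.0

# Identical states give maximal similarity.
print(sokal_sneath([1, -1, 1, 1], [1, -1, 1, 1]))  # → 1.0
print(smish(1.0))  # smooth, non-monotonic activation value near 0.5
```

In this reading, a retrieval phase that lowers the average Sokal–Sneath index against a benchmark state is producing final neuron states that differ more from that benchmark, which is the diversity effect the abstract attributes to the new updating rule.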


Published
2026-04-08
How to Cite
N. Roslan and S. Sathasivam, “Smish Activation Function with New Updating Rule in Logic Satisfiability”, BAREKENG: J. Math. & App., vol. 20, no. 3, pp. 2349-2362, Apr. 2026.