pith. machine review for the scientific record.

arxiv: 2511.07329 · v3 · submitted 2025-11-10 · 💻 cs.LG · cs.CV

Recognition: unknown

Preparation of Fractal-Inspired Computational Architectures for Automated Neural Design Exploration

Authors on Pith: no claims yet
classification 💻 cs.LG cs.CV
keywords: architectures, automated, computational, design, efficient, exploration, fractal, fractal-inspired
read the original abstract

The paper introduces FractalNet, a fractal-inspired computational architecture for advanced large language model analysis that targets large-scale model diversity in an efficient manner. The setup comprises a template-driven generator, runner, and evaluation framework that, through systematic permutations of convolutional, normalization, activation, and dropout layers, can create more than 1,200 neural-network variants. Fractal templates allow for structural recursion and multi-column pathways, so models grow deeper and wider in a balanced way. Training uses PyTorch with Automatic Mixed Precision (AMP) and gradient checkpointing, and is carried out on the CIFAR-10 dataset for five epochs. The results show that fractal-based architectures achieve strong performance while remaining computationally efficient. The paper positions fractal design as a feasible, resource-efficient method for automated architecture exploration.
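The abstract's combinatorics can be illustrated with a minimal sketch of such a template-driven generator. The option sets below (kernel sizes, normalization and activation types, dropout rates, recursion depths, column counts) are purely hypothetical choices picked so the Cartesian product exceeds the 1,200 variants the paper reports; the paper's actual search space is not given in the abstract, nor is its expansion rule, so `fractal_spec` is only an assumed reading of "structural recursion and multi-column pathways":

```python
from itertools import product

# Hypothetical search-space axes (illustrative only; not the paper's
# actual choices). Their product is 3*4*4*5*3*2 = 1440 variants.
CONV_KERNELS = [3, 5, 7]
NORMS = ["batch", "group", "layer", "none"]
ACTIVATIONS = ["relu", "gelu", "silu", "tanh"]
DROPOUTS = [0.0, 0.1, 0.2, 0.3, 0.5]
FRACTAL_DEPTHS = [1, 2, 3]   # levels of structural recursion
COLUMNS = [2, 3]             # parallel multi-column pathways

def fractal_spec(depth, columns, block):
    """Recursively expand a base block: each level places the base block
    in parallel with (columns - 1) copies of the previous level, nested
    twice in sequence — so depth grows while width stays balanced."""
    if depth == 1:
        return block
    inner = fractal_spec(depth - 1, columns, block)
    return {"parallel": [block] + [("seq", inner, inner)] * (columns - 1)}

def generate_variants():
    """Enumerate one architecture spec per permutation of the axes."""
    for kernel, norm, act, drop, f_depth, cols in product(
        CONV_KERNELS, NORMS, ACTIVATIONS, DROPOUTS, FRACTAL_DEPTHS, COLUMNS
    ):
        base = {"conv": kernel, "norm": norm, "act": act, "dropout": drop}
        yield fractal_spec(f_depth, cols, base)

variants = list(generate_variants())
print(len(variants))  # 1440 — comfortably over 1,200
```

In a full pipeline each emitted spec would then be compiled into a concrete network and handed to the runner/evaluation stages; here the point is only that a handful of layer-level axes, crossed with fractal depth and column count, is enough to pass 1,200 distinct variants.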

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Delta-Based Neural Architecture Search: LLM Fine-Tuning via Code Diffs

    cs.LG 2026-05 unverdicted novelty 7.0

    Fine-tuned 7B LLMs generating unified diffs for neural architecture refinement achieve 66-75% valid rates and 64-66% mean first-epoch accuracy, outperforming full-generation baselines by large margins while cutting ou...