pith. machine review for the scientific record.

arxiv: 1705.09792 · v4 · submitted 2017-05-27 · 💻 cs.NE · cs.LG

Recognition: unknown

Deep Complex Networks

Authors on Pith: no claims yet
classification: 💻 cs.NE · cs.LG
keywords: complex, deep, networks, neural, complex-valued, models, architectures, blocks
Original abstract

At present, the vast majority of building blocks, techniques, and architectures for deep learning are based on real-valued operations and representations. However, recent work on recurrent neural networks and older fundamental theoretical analysis suggests that complex numbers could have a richer representational capacity and could also facilitate noise-robust memory retrieval mechanisms. Despite their attractive properties and potential for opening up entirely new neural architectures, complex-valued deep neural networks have been marginalized due to the absence of the building blocks required to design such models. In this work, we provide the key atomic components for complex-valued deep neural networks and apply them to convolutional feed-forward networks and convolutional LSTMs. More precisely, we rely on complex convolutions and present algorithms for complex batch-normalization, complex weight initialization strategies for complex-valued neural nets and we use them in experiments with end-to-end training schemes. We demonstrate that such complex-valued models are competitive with their real-valued counterparts. We test deep complex models on several computer vision tasks, on music transcription using the MusicNet dataset and on Speech Spectrum Prediction using the TIMIT dataset. We achieve state-of-the-art performance on these audio-related tasks.
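The abstract's key primitive, the complex convolution, reduces to four real convolutions via the distributive law: (x_re + i·x_im) ∗ (w_re + i·w_im) = (x_re∗w_re − x_im∗w_im) + i·(x_re∗w_im + x_im∗w_re). A minimal NumPy sketch of this identity (the 1-D setting and function name are illustrative, not from the paper, which works with 2-D feature maps split into real and imaginary channels):

```python
import numpy as np

def complex_conv1d(x_re, x_im, w_re, w_im):
    """Complex 1-D 'valid' convolution built from four real convolutions.

    Implements (x_re + i*x_im) * (w_re + i*w_im)
      = (x_re*w_re - x_im*w_im) + i*(x_re*w_im + x_im*w_re),
    returning the real and imaginary parts of the result separately.
    """
    out_re = (np.convolve(x_re, w_re, mode="valid")
              - np.convolve(x_im, w_im, mode="valid"))
    out_im = (np.convolve(x_re, w_im, mode="valid")
              + np.convolve(x_im, w_re, mode="valid"))
    return out_re, out_im

# Sanity check against NumPy's native complex convolution.
x = np.array([1 + 2j, 3 - 1j, 0.5 + 0j, -2 + 1j])
w = np.array([1 - 1j, 2 + 0.5j])
re, im = complex_conv1d(x.real, x.imag, w.real, w.imag)
print(np.allclose(re + 1j * im, np.convolve(x, w, mode="valid")))  # True
```

Because the operation is just a structured combination of real convolutions, it can be implemented on top of any standard deep-learning convolution kernel by stacking real and imaginary parts as separate channels, which is how such models stay compatible with existing real-valued tooling.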

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 5 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Complex-Valued Phase-Coherent Transformer

    cs.LG · 2026-05 · unverdicted · novelty 7.0

    PCT replaces softmax token competition with a smooth phase-preserving gate on normalized complex similarities, yielding stronger generalization on long-range and phase-sensitive benchmarks than both real and complex T...

  2. Algorithm and Hardware Co-Design for Efficient Complex-Valued Uncertainty Estimation

    cs.AR · 2026-04 · unverdicted · novelty 7.0

    Proposes dropout-based BayesCVNNs with automated configuration search and FPGA accelerators that deliver 4.5x–13x speedups over GPUs while enabling uncertainty estimation for complex-valued neural networks.

  3. Phasor Memory Networks: Stable Backpropagation Through Time for Scalable Explicit Memory

    cs.LG · 2026-05 · unverdicted · novelty 6.0

    PMNet uses unitary phasor dynamics and hierarchical anchors to make explicit memory stable for long sequences, matching a 3x larger Mamba model on long-context robustness with a 119M parameter network.

  4. FEDIN: Frequency-Enhanced Deep Interest Network for Click-Through Rate Prediction

    cs.IR · 2026-05 · unverdicted · novelty 6.0

    FEDIN improves CTR prediction by using target-aware frequency filtering to isolate low-entropy periodic interest signals from high-entropy noise in user attention patterns.

  5. Magnitude Is All You Need? Rethinking Phase in Quantum Encoding of Complex SAR Data

    quant-ph · 2026-04 · unverdicted · novelty 4.0

    Magnitude-only encoding reaches 99.57% accuracy on 3-class and 71.19% on 8-class SAR tasks in hybrid models, beating phase-inclusive alternatives, while phase boosts pure quantum models by up to 21.65 points.