pith. machine review for the scientific record.

arxiv: 1412.3474 · v1 · submitted 2014-12-10 · 💻 cs.CV

Recognition: unknown

Deep Domain Confusion: Maximizing for Domain Invariance

Authors on Pith: no claims yet
classification 💻 cs.CV
keywords: domain adaptation, confusion, deep, layer, architecture, benchmark, dataset
Original abstract

Recent reports suggest that a generic supervised deep CNN model trained on a large-scale dataset reduces, but does not remove, dataset bias on a standard benchmark. Fine-tuning deep models in a new domain can require a significant amount of data, which for many applications is simply not available. We propose a new CNN architecture which introduces an adaptation layer and an additional domain confusion loss, to learn a representation that is both semantically meaningful and domain invariant. We additionally show that a domain confusion metric can be used for model selection to determine the dimension of an adaptation layer and the best position for the layer in the CNN architecture. Our proposed adaptation method offers empirical performance which exceeds previously published results on a standard benchmark visual domain adaptation task.
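The domain confusion loss described above is, in the paper's setting, a maximum mean discrepancy (MMD) term computed on the activations of the added adaptation layer. As an illustrative sketch only (function names, the `lam` weight, and the combined objective are hypothetical, not the paper's exact implementation), a linear-kernel MMD reduces to the squared distance between per-domain feature means:

```python
import numpy as np

def mmd_linear(source_feats: np.ndarray, target_feats: np.ndarray) -> float:
    """Linear-kernel MMD between two batches of adaptation-layer activations.

    With a linear kernel, squared MMD reduces to the squared Euclidean
    distance between the per-domain mean feature vectors.
    """
    delta = source_feats.mean(axis=0) - target_feats.mean(axis=0)
    return float(delta @ delta)

def total_loss(class_loss: float, src: np.ndarray, tgt: np.ndarray,
               lam: float = 0.25) -> float:
    # Hypothetical combined objective: supervised loss on labeled source
    # data plus a weighted confusion term pulling the domains together.
    return class_loss + lam * mmd_linear(src, tgt)
```

The same quantity can serve as the model-selection metric the abstract mentions: computing `mmd_linear` on activations at each candidate layer (and width) and preferring placements where the source/target discrepancy is low.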

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 7 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. A General Representation-Based Approach to Multi-Source Domain Adaptation

    cs.LG 2026-04 unverdicted novelty 7.0

    A representation learning approach for multi-source domain adaptation achieves identifiability by partitioning the label's Markov blanket into parents, children, and spouses.

  2. Adaptive Data Compression and Reconstruction for Memory-Bounded EEG Continual Learning

    cs.LG 2026-05 unverdicted novelty 6.0

    ADaCoRe enables memory-bounded UICL for EEG by compressing and reconstructing signals while preserving key morphologies, outperforming baselines with gains of at least +2.7 and +15.3 ACC on ISRUC and FACED datasets.

  3. Generalized Category Discovery under Domain Shifts: From Vision to Vision-Language Models

    cs.CV 2026-04 unverdicted novelty 6.0

    Three frameworks adapt foundation models for generalized category discovery under domain shifts via disentanglement and prompt tuning, showing gains on synthetic and real multi-domain data.

  4. PAS: Estimating the target accuracy before domain adaptation

    cs.CV 2026-04 unverdicted novelty 6.0

    PAS estimates target accuracy for domain adaptation by measuring compatibility between source domains, pre-trained feature extractors, and target tasks using embeddings, correlating strongly with actual post-adaptatio...

  5. Tunable Domain Adaptation Using Unfolding

    cs.LG 2026-03 unverdicted novelty 6.0

    Two tunable domain adaptation methods using unrolled networks achieve improved or comparable performance to domain-specific models on compressed sensing regression tasks.

  6. Effective Knowledge Transfer for Multi-Task Recommendation Models

    cs.IR 2026-05 unverdicted novelty 5.0

    EKTM introduces router, transmitter, and enhanced modules to transfer knowledge across multi-task CVR models, outperforming prior methods with a 3.93% eCPM uplift in online A/B tests on a commercial platform.

  7. A Robust Unsupervised Domain Adaptation Framework for Medical Image Classification Using RKHS-MMD

    cs.CV 2026-05 unverdicted novelty 3.0

    An RKHS-MMD domain adaptation method yields better chest X-ray classification across medical centers than non-adapted models by aligning source and target feature distributions.