pith. machine review for the scientific record.

arxiv: 2605.11339 · v1 · submitted 2026-05-11 · 💻 cs.CR

Recognition: no theorem link

A Systematic Security Testing Approach for InterUSS-based environments

Ágney Lopes Roth Ferraz, Henrique Curi de Miranda, Lourenço Alves Pereira Júnior, Wagner Comin Sonaglio

Pith reviewed 2026-05-13 01:20 UTC · model grok-4.3

classification 💻 cs.CR
keywords security testing · InterUSS · unmanned traffic management · UTM · mTLS · OAuth 2.0 · UAS Service Suppliers · federated ecosystems

The pith

Deploying and analyzing one InterUSS infrastructure yields a Testing Guide of security tests, aligned with mTLS and OAuth 2.0, for unmanned traffic management systems.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper develops a security testing approach for InterUSS-based environments in unmanned aircraft traffic management. Authors deploy and analyze a working infrastructure to identify its key components, then create tests that follow established protocols such as mTLS and OAuth 2.0. These tests are collected into a single Testing Guide intended to support both validation of individual components and analysis of how components interact in federated setups. A sympathetic reader would care because federated UTM systems coordinate many independent service providers and currently lack documented, practical security checks at the infrastructure level.
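
To make "tests aligned with established protocols" concrete, here is a minimal sketch of one such check on the OAuth 2.0 side: inspecting the claims of an access token exchanged between USSs. This is our illustration, not a test from the paper's Testing Guide; the required claim set and scope value are invented.

```python
import base64
import json
import time

# Claims an inter-USS access token is assumed to need; the exact set and
# scope names are illustrative, not taken from the paper's Testing Guide.
REQUIRED_CLAIMS = {"iss", "sub", "aud", "scope", "exp"}

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT segments are encoded."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def decode_payload(token: str) -> dict:
    """Extract the claims object from a JWT (header.payload.signature)."""
    # Signature verification is omitted here: it needs the authorization
    # server's public key and would be a separate test case.
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload))

def check_token_claims(token: str) -> list:
    """Return findings; an empty list means the token passes this check."""
    claims = decode_payload(token)
    findings = sorted(
        "missing claim: " + c for c in REQUIRED_CLAIMS - claims.keys()
    )
    if "exp" in claims and claims["exp"] <= time.time():
        findings.append("token expired")
    return findings
```

A guide entry built this way stays declarative: the required-claims set is the policy, and the function is the repeatable check applied to whatever token a component actually presents.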

Core claim

By deploying and analyzing a working InterUSS infrastructure, key components are pinpointed and specific security tests aligned with mTLS and OAuth 2.0 are developed. These tests are compiled into a Testing Guide that aids both component validation and interaction analysis across InterUSS-based ecosystems.

What carries the argument

The Testing Guide: a compiled set of security tests, derived from analysis of one deployed InterUSS infrastructure, that supports both component validation and interaction analysis.

If this is right

  • Maintainers obtain a repeatable checklist to validate that their USS implementations meet mTLS and OAuth 2.0 requirements.
  • Operators gain a method to examine security properties of data exchanges between separate USS instances in the same ecosystem.
  • The approach supplies concrete tests that can be applied when new components are added to an existing InterUSS deployment.
  • Research on UTM security receives a documented baseline set of tests that future work can reference or extend.
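
One repeatable checklist item of the kind described above could look like the following sketch, which audits a server-side TLS configuration for mTLS enforcement. The audit rules are ours, assumed for illustration, not taken from the paper; a real InterUSS deployment terminates TLS in its own stack.

```python
import ssl

def audit_mtls_server_context(ctx: ssl.SSLContext) -> list:
    """Flag server-side TLS settings that would weaken an mTLS deployment.

    Hypothetical audit rules in the spirit of a maintainer's checklist:
    clients must present certificates, and legacy TLS must be rejected.
    """
    findings = []
    if ctx.verify_mode != ssl.CERT_REQUIRED:
        findings.append("client certificates not required: mTLS is off")
    if ctx.minimum_version < ssl.TLSVersion.TLSv1_2:
        findings.append("TLS versions below 1.2 are accepted")
    return findings

# A freshly created server context does not require client certificates,
# so the audit should flag it until the deployment opts in to mTLS.
default_ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
```

The value of encoding the check this way is that it runs against configuration objects, so it can validate a component before it is ever exposed to peer USS traffic.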

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same deployment-and-test method could be applied to other federated coordination platforms that rely on similar authentication and transport protocols.
  • Periodic re-running of the guide on live systems might detect configuration drift or newly introduced vulnerabilities over time.
  • Standardization bodies for UTM could incorporate the guide's test cases into certification requirements for USS providers.
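
The drift-detection extension can be sketched as a diff between a recorded baseline of security-relevant settings and a fresh snapshot taken on each re-run of the guide. The setting names here are hypothetical, invented for illustration.

```python
def detect_drift(baseline: dict, current: dict) -> dict:
    """Map each setting that changed to its (baseline, current) pair.

    A generic sketch of periodic re-checking; the keys are hypothetical
    examples of what a guide run might record, not the paper's tests.
    """
    keys = baseline.keys() | current.keys()
    return {
        key: (baseline.get(key), current.get(key))
        for key in sorted(keys)
        if baseline.get(key) != current.get(key)
    }
```

Settings that appear or disappear between runs show up with `None` on the missing side, so newly introduced surface area is flagged alongside weakened settings.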

Load-bearing premise

That analyzing one deployed InterUSS infrastructure suffices to identify all key components, and that tests based on mTLS and OAuth 2.0 comprehensively address infrastructure-level security challenges.

What would settle it

Running the guide's tests on a second, independently deployed InterUSS infrastructure and discovering a security vulnerability in a component or interaction that none of the tests flag would show the guide is incomplete.

Figures

Figures reproduced from arXiv: 2605.11339 by Ágney Lopes Roth Ferraz, Henrique Curi de Miranda, Lourenço Alves Pereira Júnior, Wagner Comin Sonaglio.

Figure 1
Figure 1. Typical InterUSS architecture.
Original abstract

Unmanned Traffic Management (UTM) federated ecosystems, such as InterUSS, enable secure coordination among UAS Service Suppliers (USSs). However, they bring up some security challenges at the infrastructure level that haven't been fully explored. This paper presents a security testing approach for InterUSS-based environments from the maintainer's perspective. By deploying and analyzing a working InterUSS infrastructure, we pinpoint key components and develop specific security tests aligned with established standards and protocols, such as mTLS and OAuth 2.0. We compiled these tests into a Testing Guide that aids both component validation and interaction analysis across InterUSS-based ecosystems, filling a gap in current research.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 2 minor

Summary. The paper claims to address unexplored infrastructure-level security challenges in InterUSS-based UTM federated ecosystems by deploying and analyzing one working InterUSS infrastructure, identifying key components, and developing specific security tests aligned with mTLS and OAuth 2.0. These tests are compiled into a Testing Guide intended to support component validation and interaction analysis across InterUSS-based ecosystems, thereby filling a gap in current research.

Significance. If validated through empirical results and shown to generalize beyond a single deployment, the Testing Guide could offer practical value to UTM system maintainers by providing a structured approach to security testing in federated environments. However, the current lack of any reported validation, error analysis, or evidence of test effectiveness substantially reduces its potential significance as a contribution to security research.

major comments (3)
  1. [Abstract] The central claim that the compiled tests and Testing Guide 'aid both component validation and interaction analysis' and fill a research gap is not supported by any validation results, case studies, error analysis, or empirical evidence that the tests detect the claimed security issues. This absence is load-bearing for the paper's contribution.
  2. [Methods/Deployment section (implied)] The approach identifies key components and produces tests applicable 'across InterUSS-based ecosystems' based solely on analysis of one working infrastructure. No discussion, additional deployments, or analysis of variations in USS implementations, network topologies, certificate management, or OAuth scopes is provided to substantiate generalizability.
  3. [Introduction/Related Work] The claim of filling a gap in current research rests on an implied literature review, but no explicit comparison to prior security testing approaches in UTM or InterUSS contexts, nor details of the review process, is supplied to demonstrate novelty.
minor comments (2)
  1. The manuscript would benefit from clearer cross-references between the identified components, the developed tests, and their alignment with mTLS/OAuth 2.0 standards to improve readability.
  2. If the Testing Guide is provided as supplementary material or an appendix, it should be explicitly referenced in the main text with at least one illustrative example of test application.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for the constructive feedback highlighting areas where the manuscript's claims require better support and contextualization. We address each major comment below and will make revisions to clarify the scope of the contribution, acknowledge limitations, and strengthen the presentation of novelty and applicability.

Point-by-point responses
  1. Referee: [Abstract] The central claim that the compiled tests and Testing Guide 'aid both component validation and interaction analysis' and fill a research gap is not supported by any validation results, case studies, error analysis, or empirical evidence that the tests detect the claimed security issues. This absence is load-bearing for the paper's contribution.

    Authors: We acknowledge that the current manuscript does not include empirical validation, such as test execution results, error analysis, or case studies demonstrating detection of specific security issues. The core contribution is the systematic identification of components in a deployed InterUSS infrastructure and the compilation of tests aligned with mTLS and OAuth 2.0. We will revise the abstract to more precisely state that the Testing Guide offers a structured set of tests derived from this analysis and intended to support validation and interaction analysis, without claiming proven effectiveness. We will also add a limitations section outlining the need for future empirical evaluation. revision: yes

  2. Referee: [Methods/Deployment section (implied)] The approach identifies key components and produces tests applicable 'across InterUSS-based ecosystems' based solely on analysis of one working infrastructure. No discussion, additional deployments, or analysis of variations in USS implementations, network topologies, certificate management, or OAuth scopes is provided to substantiate generalizability.

    Authors: The analysis is indeed based on a single working deployment, as described. We agree that explicit discussion of generalizability is missing. In revision, we will expand the methods and discussion sections to map the identified components and tests to the InterUSS specification, explain how the approach leverages standardized protocols to apply across ecosystems, and analyze potential variations in implementations (e.g., certificate management and OAuth scopes) with guidance on adaptation. This will provide a reasoned basis for broader applicability without additional deployments. revision: yes

  3. Referee: [Introduction/Related Work] The claim of filling a gap in current research rests on an implied literature review, but no explicit comparison to prior security testing approaches in UTM or InterUSS contexts, nor details of the review process, is supplied to demonstrate novelty.

    Authors: We will revise the related work section to include explicit comparisons with prior security testing methods in UTM and federated systems, as well as any relevant InterUSS-specific studies. We will also briefly describe the literature review process that led to identifying the infrastructure-level gap. These changes will more clearly establish the novelty of our maintainer-focused, protocol-aligned testing guide. revision: yes

Circularity Check

0 steps flagged

No circularity in derivation chain

full rationale

The paper describes an empirical methodology: deploy and analyze one InterUSS infrastructure, identify components, align tests with mTLS and OAuth 2.0 standards, and compile a Testing Guide. No equations, parameters, or derivations are present. The claim of filling a research gap is asserted as the result of this process rather than reduced to the inputs by construction, self-citation, or renaming. The single-deployment scope raises generalizability questions but does not create a circular reduction per the defined patterns.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

No free parameters, axioms, or invented entities are described in the abstract. The approach relies on deployment of existing infrastructure and alignment with standard protocols whose correctness is assumed from prior literature.



Reference graph

Works this paper leans on

25 extracted references · 25 canonical work pages

  1. [1]

    Ball, C. J. (2022). Hacking APIs: Breaking Web Application Programming Interfaces. No Starch Press

  2. [2]

    Burbank, J., Caleb, T., Andam, E., and Kaabouch, N. (2026). Detection and mitigation of cyber attacks on UAV networks. MDPI Electronics

  3. [3]

    Colter, J., Kinnison, M., Henderson, A., Schlager, S. M., Bryan, S., O’Grady, K. L., Abballe, A., and Harbour, S. (2022). Testing the resiliency of consumer off-the-shelf drones to a variety of cyberattack methods. IEEE/AIAA Digital Avionics Systems Conference

  4. [4]

    Ding, A., Chan, M., Hass, A., Tippenhauer, N. O., Ma, S., and Zonouz, S. (2023). Get your cyber-physical tests done! Data-driven vulnerability assessment of robotic aerial vehicles. Annual IEEE/IFIP International Conference on Dependable Systems and Networks.
    European Commission (2021). (EU) 2021/664

  5. [5]

    Fett, D., Küsters, R., and Schmitz, G. (2016). A comprehensive formal security analysis of OAuth 2.0. 2016 ACM SIGSAC Conference on Computer and Communications Security

  6. [6]

    Fouda, R. M. (2018). Security vulnerabilities of cyberphysical unmanned aircraft systems. IEEE Aerospace and Electronic Systems Magazine

  7. [7]

    Gabrielsson, J., Bugeja, J., and Vogel, B. (2021). Hacking a commercial drone with open-source software: Exploring data privacy violations. Mediterranean Conference on Embedded Computing

  8. [8]

    Guo, R., Wang, B., and Weng, J. (2020). Vulnerabilities and attacks of UAV cyber physical systems. International Conference on Computing, Networks and Internet of Things

  9. [9]

    Hamissi, A., Dhraief, A., and Sliman, L. (2025). A review on safety and security in decentralized extensible traffic management systems. SN Computer Science.
    IETF (2015). RFC 7519 - JSON Web Token (JWT). https://datatracker.ietf.org/doc/html/rfc7519. [Accessed at: May, 2026].
    IETF (2020). RFC 8705 - OAuth 2.0 Mutual-TLS Client Authentication and Certificat...

  10. [10]

    Kang, H., Liu, G., Wang, Q., Meng, L., and Liu, J. (2023). Theory and application of zero trust security: A brief survey. MDPI Entropy

  11. [11]

    Karmakar, G., Petty, M., Ahmed, H., Das, R., and Kamruzzaman, J. (2022). Security of internet of things devices: Ethical hacking a drone and its mitigation strategies. IEEE Asia-Pacific Conference on Computer Science and Data Engineering

  12. [12]

    Kong, P.-Y. (2021). A survey of cyberattack countermeasures for unmanned aerial vehicles. IEEE Access

  13. [13]

    Kumar, C. and Mohanty, S. (2021). Current trends in cyber security for drones. International Carnahan Conference on Security Technology

  14. [14]

    Marchetti, E., Waheed, T., and Calabrò, A. (2024). Cybersecurity testing in drones domain: A systematic literature review. IEEE Access

  15. [15]

    Nouacer, R., Hussein, M., Detterer, P., Villar, E., Herrera, F., Tieri, C., and Grolleau, E. (2023). Towards a European network of enabling technologies for drones. DroneSE and RAPIDO: System Engineering for constrained embedded systems

  16. [16]

    Oli, A. and Mahalal, E. (2025). UAV security: Attacks, defenses, and open challenges. IEEE Access

  17. [17]

    Ongaro, D. and Ousterhout, J. (2014). In search of an understandable consensus algorithm. 2014 USENIX Annual Technical Conference.
    OWASP (2020). OWASP web security testing guide v4.2. https://owasp.org/www-project-web-security-testing-guide/. [Accessed at: May, 2026]

  18. [18]

    Platform, I. (2026). InterUSS monitoring v0.28.0. https://github.com/interuss/monitoring. [Accessed at: May, 2026]

  19. [19]

    Restituyo, R. and Hayajneh, T. (2018). Vulnerabilities and attacks analysis for military and commercial IoT drones. IEEE Annual Ubiquitous Computing, Electronics & Mobile Communication Conference

  20. [20]

    Salamh, F. E., Karabiyik, U., Rogers, M. K., and Matson, E. T. (2021). Unmanned aerial vehicle kill chain: Purple teaming tactics. IEEE Annual Computing and Communication Workshop and Conference

  21. [21]

    Sampigethaya, K. and Kopardekar, P. (2018). Cyber security of unmanned aircraft system traffic management (UTM). Integrated Communications, Navigation, Surveillance Conference (ICNS)

  22. [22]

    Sanghavi, P. and Kaur, H. (2023). A comprehensive study on cyber security in unmanned aerial vehicles. International Conference on Computing for Sustainable Global Development

  23. [23]

    Stuttard, D. and Pinto, M. (2011). The Web Application Hacker's Handbook, 2nd Edition. Wiley

  24. [24]

    Veerappan, C. S., Keong, P. L. K., Balachandran, V., and Fadilah, M. S. B. M. (2021). DRAT: A penetration testing framework for drones. IEEE Conference on Industrial Electronics and Applications

  25. [25]

    Wu, S., Li, Y., Wang, Z., Tan, Z., and Pan, Q. (2023). A highly interpretable framework for generic low-cost UAV attack detection. IEEE Sensors Journal