FlexiTac: A Low-Cost, Open-Source, Scalable Tactile Sensing Solution for Robotic Systems
Pith reviewed 2026-05-07 06:51 UTC · model grok-4.3
The pith
A sealed three-layer laminate delivers affordable, repeatable tactile sensing to robotic grippers.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
FlexiTac consists of flexible tactile sensor pads using a sealed three-layer laminate stack of FPC-Velostat-FPC with electrode patterns integrated into the flexible printed circuits, paired with a compact multi-channel readout board that streams synchronized measurements at 100 Hz via serial communication. This design improves fabrication throughput and repeatability, maintains compliance for deployment on diverse grippers, and supports tactile learning pipelines including 3D visuo-tactile fusion, cross-embodiment skill transfer, and real-to-sim-to-real fine-tuning with GPU-parallel simulation.
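The paper does not specify the serial wire format, but the 100 Hz synchronized-streaming claim is concrete enough to sketch. A minimal host-side frame parser, assuming a hypothetical framed binary protocol (the sync marker, taxel count, and field layout here are illustrative assumptions, not the released firmware's actual format):

```python
import struct

# Hypothetical frame layout (the released firmware may differ):
# 2-byte sync marker, 4-byte little-endian timestamp in ms,
# then one uint16 per taxel.
SYNC = b"\xaa\x55"
N_TAXELS = 16
FRAME_FMT = "<I" + "H" * N_TAXELS            # timestamp + taxel values
FRAME_SIZE = len(SYNC) + struct.calcsize(FRAME_FMT)

def parse_frames(buf: bytes):
    """Yield (timestamp_ms, taxel_values) for each complete frame in buf."""
    i = 0
    while True:
        j = buf.find(SYNC, i)                # resync on the marker
        if j < 0 or j + FRAME_SIZE > len(buf):
            return                           # no complete frame left
        fields = struct.unpack_from(FRAME_FMT, buf, j + len(SYNC))
        yield fields[0], fields[1:]
        i = j + FRAME_SIZE
```

Resynchronizing on a marker rather than trusting byte alignment is what keeps a multi-pad stream recoverable after a dropped byte.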
What carries the argument
The sealed three-layer laminate stack (FPC-Velostat-FPC) with electrode patterns directly integrated into flexible printed circuits, which enables improved fabrication throughput, repeatability, and mechanical compliance for tactile sensing on robotic systems.
Load-bearing premise
That the three-layer laminate and low-cost readout electronics will deliver dense, synchronized, and repeatable tactile signals across varied mounting configurations and real-world use without requiring extensive per-unit calibration or suffering from mechanical or electrical drift.
What would settle it
Finding that fabricated sensor units show large variations in sensitivity or that signals drift significantly after repeated mounting and flexing on a gripper, or that synchronization cannot be maintained at 100 Hz with multiple connected pads.
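Both failure modes are cheap to test once units are in hand. A sketch of the two checks, with an illustrative 10% timing tolerance that is our assumption rather than a published spec:

```python
import statistics

def sensitivity_cv(unit_sensitivities):
    """Coefficient of variation of per-unit sensitivity (e.g. counts/N).
    A large CV across fabricated pads would undercut the repeatability claim."""
    mean = statistics.mean(unit_sensitivities)
    return statistics.stdev(unit_sensitivities) / mean

def sync_ok(timestamps_ms, rate_hz=100.0, tol=0.1):
    """True if every inter-frame interval stays within tol (as a fraction)
    of the nominal period, i.e. streaming actually holds the claimed rate."""
    period = 1000.0 / rate_hz
    gaps = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return all(abs(g - period) <= tol * period for g in gaps)
```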
Original abstract
We present FlexiTac, a low-cost, open-source, and scalable piezoresistive tactile sensing solution designed for robotic end-effectors. FlexiTac is a practical "plug-in" module consisting of (i) thin, flexible tactile sensor pads that provide dense tactile signals and (ii) a compact multi-channel readout board that streams synchronized measurements for real-time control and large-scale data collection. FlexiTac pads adopt a sealed three-layer laminate stack (FPC-Velostat-FPC) with electrode patterns directly integrated into flexible printed circuits, substantially improving fabrication throughput and repeatability while maintaining mechanical compliance for deployment on both rigid and soft grippers. The readout electronics use widely available, low-cost components and stream tactile signals to a host computer at 100 Hz via serial communication. Across multiple configurations, including fingertip pads and larger tactile mats, FlexiTac can be mounted on diverse platforms without major mechanical redesign. We further show that FlexiTac supports modern tactile learning pipelines, including 3D visuo-tactile fusion for contact-aware decision making, cross-embodiment skill transfer, and real-to-sim-to-real fine-tuning with GPU-parallel tactile simulation. Our project page is available at https://flexitac.github.io/.
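For context on the piezoresistive readout: Velostat's resistance drops under pressure, and a common low-cost way to digitize it is a fixed reference resistor in a voltage divider feeding an ADC. A sketch under that assumption (the actual FlexiTac readout topology and component values are not stated in the abstract):

```python
def sensor_resistance(adc_counts, adc_max=4095, r_ref=10_000.0):
    """Recover a Velostat cell's resistance from a raw ADC reading,
    assuming the divider Vcc -- R_sensor -- node -- R_ref -- GND with
    the ADC sampling the node, so Vout/Vcc = R_ref / (R_sensor + R_ref).
    All component values here are illustrative assumptions."""
    if adc_counts <= 0:
        return float("inf")               # open circuit / no contact
    vout_frac = adc_counts / adc_max      # node voltage as fraction of Vcc
    return r_ref * (1.0 / vout_frac - 1.0)
```

Inverting the divider like this is why readings saturate at light touch if `r_ref` is poorly matched to the sensor's resistance range.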
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper presents FlexiTac, a low-cost, open-source tactile sensing system for robotic end-effectors consisting of thin flexible sensor pads based on a sealed three-layer FPC-Velostat-FPC laminate with integrated electrode patterns and a compact multi-channel readout board that streams synchronized tactile signals at 100 Hz via serial communication. The design is claimed to improve fabrication throughput and repeatability while preserving mechanical compliance for mounting on both rigid and soft grippers. The work further asserts compatibility with modern tactile learning pipelines, including 3D visuo-tactile fusion for contact-aware decisions, cross-embodiment skill transfer, and real-to-sim-to-real fine-tuning.
Significance. If the performance and integration claims are substantiated, FlexiTac could lower barriers to tactile sensing in robotics by providing an accessible, scalable hardware platform with open-source components and simulation support. The emphasis on mechanical compliance across gripper types and real-time streaming at 100 Hz addresses practical needs in manipulation research. The open-source release and explicit support for visuo-tactile and sim-to-real pipelines are clear strengths that could accelerate community adoption.
Major comments (2)
- §4 (Sensor Design): The central claim that the sealed three-layer FPC-Velostat-FPC laminate "substantially improv[es] fabrication throughput and repeatability" while mitigating known Velostat issues (hysteresis, temperature sensitivity, fatigue) is not supported by any quantitative characterization data, such as multi-cycle resistance-pressure curves, drift measurements over time, or temperature variation tests. This directly bears on the "plug-and-play" and scalability assertions.
- §6 (Experiments and Integration): The demonstrations of support for tactile learning pipelines (visuo-tactile fusion, cross-embodiment transfer, real-to-sim-to-real) are described at a high level with no reported metrics, number of trials, success rates, error bars, or comparisons to baseline sensors. Without such data, it is not possible to evaluate whether the 100 Hz synchronized streaming delivers the dense, repeatable signals required for the claimed applications.
Minor comments (2)
- The manuscript references a project page but does not include an explicit statement on the availability of CAD files, PCB layouts, bill of materials, or firmware code within the text itself.
- Figure captions for the laminate stack and readout board could more explicitly label the three-layer construction and electrode patterns to aid readers in understanding the fabrication improvements.
Simulated Author's Rebuttal
We thank the referee for their thorough review and constructive comments. We address each major comment below and indicate the revisions we will make to strengthen the manuscript.
Point-by-point responses
- Referee: §4 (Sensor Design): The central claim that the sealed three-layer FPC-Velostat-FPC laminate "substantially improv[es] fabrication throughput and repeatability" while mitigating known Velostat issues (hysteresis, temperature sensitivity, fatigue) is not supported by any quantitative characterization data, such as multi-cycle resistance-pressure curves, drift measurements over time, or temperature variation tests. This directly bears on the "plug-and-play" and scalability assertions.
Authors: We acknowledge that the manuscript lacks quantitative characterization of the sensor's performance metrics such as hysteresis, drift, and temperature sensitivity. While the design choices are intended to address these issues through the sealed laminate structure, we agree that empirical data is necessary to substantiate the claims of improved fabrication throughput, repeatability, and mitigation of Velostat drawbacks. In the revised manuscript, we will include results from multi-cycle resistance-pressure tests, long-term drift measurements, and temperature variation experiments to provide the required quantitative support for the scalability and plug-and-play assertions. Revision: yes.
- Referee: §6 (Experiments and Integration): The demonstrations of support for tactile learning pipelines (visuo-tactile fusion, cross-embodiment transfer, real-to-sim-to-real) are described at a high level with no reported metrics, number of trials, success rates, error bars, or comparisons to baseline sensors. Without such data, it is not possible to evaluate whether the 100 Hz synchronized streaming delivers the dense, repeatable signals required for the claimed applications.
Authors: We recognize that the current description of the integration experiments is high-level and lacks specific quantitative metrics. The manuscript focuses on demonstrating the feasibility of using FlexiTac in various learning pipelines, but to allow proper evaluation of the sensor's performance in these contexts, we will expand this section in the revision. This will include details on the number of trials, success rates, error bars, and where possible, comparisons to other sensors or baselines, along with confirmation that the 100 Hz streaming provides sufficient data density and repeatability for the applications. Revision: yes.
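The characterization both sides agree to add reduces to simple scalar metrics. As an illustrative sketch of how hysteresis and drift could be scored from the promised multi-cycle and constant-load data (these metric definitions are our assumptions, not the authors' protocol):

```python
def hysteresis_pct(load_readings, unload_readings):
    """Worst loading/unloading gap at matched pressure points, as a
    percentage of full-scale output -- one common way to score the
    Velostat hysteresis the referee asks about."""
    full_scale = (max(max(load_readings), max(unload_readings))
                  - min(min(load_readings), min(unload_readings)))
    gap = max(abs(u - l) for l, u in zip(load_readings, unload_readings))
    return 100.0 * gap / full_scale

def drift_pct(readings_over_time):
    """Relative change from first to last reading under constant load,
    as a percentage of the initial reading."""
    first, last = readings_over_time[0], readings_over_time[-1]
    return 100.0 * (last - first) / first
```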
Circularity Check
No circularity: hardware design paper with no derivations or fitted predictions
full rationale
The paper describes a physical tactile sensor (FPC-Velostat-FPC laminate, readout board, 100 Hz streaming) and its fabrication and usage for robotic applications. No equations, parameter fittings, uniqueness theorems, or self-referential derivations appear in the abstract or described content. Performance claims rest on engineering choices and empirical use cases (visuo-tactile fusion, sim-to-real), not on any step that reduces to its own inputs by construction. This matches the reader's circularity score of 0.0; the skeptic's concerns address empirical robustness rather than logical circularity.
Reference graph
Works this paper leans on
- [1] P. Jenmalm and R. S. Johansson. Visual and somatosensory information about object shape control manipulative fingertip forces. Journal of Neuroscience, 17(11):4486–4499, 1997. doi:10.1523/JNEUROSCI.17-11-04486.1997
- [2] R. S. Johansson and G. Westling. Roles of glabrous skin receptors and sensorimotor memory in automatic control of precision grip when lifting rougher or more slippery objects. Experimental Brain Research, 56:550–564, 1984
- [3] R. Calandra, A. Owens, M. Upadhyaya, W. Yuan, J. Lin, E. H. Adelson, and S. Levine. The feeling of success: Does touch sensing help predict grasp outcomes? In Proceedings of the Conference on Robot Learning, 2017
- [4] R. Calandra, A. Owens, D. Jayaraman, J. Lin, W. Yuan, J. Malik, E. H. Adelson, and S. Levine. More than a feeling: Learning to grasp and regrasp using vision and touch. IEEE Robotics and Automation Letters, 3(4):3300–3307, 2018. doi:10.1109/LRA.2018.2852779
- [5] M. A. Lee, Y. Zhu, K. Srinivasan, P. Shah, S. Savarese, L. Fei-Fei, A. Garg, and J. Bohg. Making sense of vision and touch: Self-supervised learning of multimodal representations for contact-rich tasks. In Proceedings of the IEEE International Conference on Robotics and Automation, pages 8943–8950, 2019. doi:10.1109/ICRA.2019.8793485
- [6] B. Huang, Y. Wang, X. Yang, Y. Luo, and Y. Li. 3D-ViTac: Learning fine-grained manipulation with visuo-tactile sensing. In 8th Annual Conference on Robot Learning, 2024
- [7] N. Sunil, S. Wang, Y. She, E. Adelson, and A. Rodriguez. Visuotactile affordances for cloth manipulation with local control. In Conference on Robot Learning, pages 1596–1606. PMLR, 2023
- [8] H. Chen, J. Xu, H. Chen, K. Hong, B. Huang, C. Liu, J. Mao, Y. Li, Y. Du, and K. Driggs-Campbell. Multi-modal manipulation via multi-modal policy consensus. arXiv preprint arXiv:2509.23468, 2025
- [9] W. Yuan, S. Dong, and E. H. Adelson. GelSight: High-resolution robot tactile sensors for estimating geometry and force. Sensors, 17(12):2762, 2017. doi:10.3390/s17122762
- [10] M. Lambeta, P.-W. Chou, S. Tian, B. Yang, B. Maloon, V. R. Most, D. Stroud, R. Santos, A. Byagowi, G. Kammerer, D. Jayaraman, and R. Calandra. DIGIT: A novel design for a low-cost compact high-resolution tactile sensor with application to in-hand manipulation. IEEE Robotics and Automation Letters, 5(3):3838–3845, 2020. doi:10.1109/LRA.2020.2977257
- [11] B. Ward-Cherrier, N. Pestell, L. Cramphorn, B. Winstone, M. E. Giannaccini, J. Rossiter, and N. F. Lepora. The TacTip family: Soft optical tactile sensors with 3D-printed biomimetic morphologies. Soft Robotics, 5(2):216–227, 2018. doi:10.1089/soro.2017.0052
- [13] A. Padmanabha, F. Ebert, S. Tian, R. Calandra, C. Finn, and S. Levine. OmniTact: A multi-directional high-resolution touch sensor. In 2020 IEEE International Conference on Robotics and Automation (ICRA), pages 618–624. IEEE, 2020
- [14] I. H. Taylor, S. Dong, and A. Rodriguez. GelSlim 3.0: High-resolution measurement of shape, force and slip in a compact tactile-sensing finger. In 2022 International Conference on Robotics and Automation (ICRA), pages 10781–10787. IEEE, 2022
- [15] G. Büscher, M. Meier, G. Walck, R. Haschke, and H. J. Ritter. Augmenting curved robot surfaces with soft tactile skin. In 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 1514–1519. IEEE, 2015
- [16] R. Bhirangi, T. Hellebrekers, C. Majidi, and A. Gupta. ReSkin: Versatile, replaceable, lasting tactile skins. In Proceedings of the Conference on Robot Learning, 2021
- [17] R. Bhirangi, V. Pattabiraman, E. Erciyes, Y. Cao, T. Hellebrekers, and L. Pinto. AnySkin: Plug-and-play skin sensing for robotic touch. arXiv preprint arXiv:2409.08276, 2024
- [18] V. Pattabiraman, Z. Huang, D. Panozzo, D. Zorin, L. Pinto, and R. Bhirangi. eFlesh: Highly customizable magnetic touch sensing using cut-cell microstructures. arXiv preprint arXiv:2506.09994, 2025
- [19] T. P. Tomo, M. Regoli, A. Schmitz, L. Natale, H. Kristanto, S. Somlor, L. Jamone, G. Metta, and S. Sugano. A new silicone structure for uSkin—a soft, distributed, digital 3-axis skin sensor and its integration on the humanoid robot iCub. IEEE Robotics and Automation Letters, 3(3):2584–2591, 2018
- [20] R. Bhirangi, A. DeFranco, J. Adkins, C. Majidi, A. Gupta, T. Hellebrekers, and V. Kumar. All the feels: A dexterous hand with large area sensing. arXiv preprint arXiv:2210.15658, 2022
- [21] S. Wistreich, B. Shi, S. Tian, S. Clarke, M. Nath, C. Xu, Z. Bao, and J. Wu. DexSkin: High-coverage conformable robotic skin for learning contact-rich manipulation. In Conference on Robot Learning, 2025. URL https://arxiv.org/abs/2509.18830
- [22] Y. Luo, Y. Li, P. Sharma, W. Shou, K. Wu, M. Foshey, B. Li, T. Palacios, A. Torralba, and W. Matusik. Learning human–environment interactions using conformal tactile textiles. Nature Electronics, 4(3):193–201, 2021. doi:10.1038/s41928-021-00558-0
- [23] L. Zlokapa, Y. Luo, J. Xu, M. Foshey, K. Wu, P. Agrawal, and W. Matusik. An integrated design pipeline for tactile sensing robotic manipulators. In Proceedings of the IEEE International Conference on Robotics and Automation, pages 3137–3144, 2022. doi:10.1109/ICRA46639.2022.9812335
- [24] D. Murphy, Y. Li, C. E. Owens, L. Stanton, P. P. Liang, Y. Luo, A. Torralba, and W. Matusik. Fits like a flex-glove: Automatic design of personalized FPCB-based tactile sensing gloves. In Proceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, CHI EA '25. Association for Computing Machinery, 2025. doi:10.1145/3...
- [25] S. Sundaram, P. Kellnhofer, Y. Li, J.-Y. Zhu, A. Torralba, and W. Matusik. Learning the signatures of the human grasp using a scalable tactile glove. Nature, 569(7758):698–702, 2019. doi:10.1038/s41586-019-1234-z
- [28] P. Falco, S. Lu, A. Cirillo, C. Natale, S. Pirozzi, and D. Lee. Cross-modal visuo-tactile object recognition using robotic active exploration. In 2017 IEEE International Conference on Robotics and Automation (ICRA), pages 5273–5280. IEEE, 2017
- [29] I. Guzey, B. Evans, S. Chintala, and L. Pinto. Dexterity from touch: Self-supervised pre-training of tactile representations with robotic play, 2023
- [33] N. Tao, Y. He, W. Maa, B. Huang, and Y. Li. LeFlexiTac: Giving robots a sense of touch. Columbia University RoboPIL Blog, 2026. https://tna001-ai.github.io/tactile-lerobot-website/
- [34] H. van Hoof, N. Chen, M. Karl, P. van der Smagt, and J. Peters. Stable reinforcement learning with autoencoders for tactile and visual data. In 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 3928–3934. IEEE, 2016
- [35] J. Hansen, F. Hogan, D. Rivkin, D. Meger, M. Jenkin, and G. Dudek. Visuotactile-RL: Learning multimodal manipulation policies with deep reinforcement learning. In 2022 International Conference on Robotics and Automation (ICRA), pages 8298–8304. IEEE, 2022
- [36] Y. Chen, M. Van der Merwe, A. Sipos, and N. Fazeli. Visuo-tactile transformers for manipulation. In 6th Annual Conference on Robot Learning, 2022
- [37] Y. Qin, B. Huang, Z.-H. Yin, H. Su, and X. Wang. DexPoint: Generalizable point cloud reinforcement learning for sim-to-real dexterous manipulation. Conference on Robot Learning (CoRL), 2022
- [38] C. Chi, Z. Xu, C. Pan, E. Cousineau, B. Burchfiel, S. Feng, R. Tedrake, and S. Song. Universal manipulation interface: In-the-wild robot teaching without in-the-wild robots. arXiv preprint arXiv:2402.10329, 2024
- [40] X. Zhu, B. Huang, and Y. Li. Touch in the wild: Learning fine-grained manipulation with a portable visuo-tactile gripper. In The Thirty-ninth Annual Conference on Neural Information Processing Systems, 2025. URL https://openreview.net/forum?id=WabVVQKTUF
- [42] Y. S. Narang, B. Sundaralingam, M. Macklin, A. Mousavian, and D. Fox. Sim-to-real for robotic tactile sensing via physics-based simulation and learned latent projections. In 2021 IEEE International Conference on Robotics and Automation (ICRA), pages 6444–6451. IEEE, 2021
- [43] T. Bi, C. Sferrazza, and R. D'Andrea. Zero-shot sim-to-real transfer of tactile control policies for aggressive swing-up manipulation. IEEE Robotics and Automation Letters, 6(3):5761–5768, 2021. doi:10.1109/LRA.2021.3084889
- [45] A. Church, J. Lloyd, R. Hadsell, and N. F. Lepora. Tactile sim-to-real policy transfer via real-to-sim image translation. In Proceedings of the Conference on Robot Learning, 2022
- [46] E. Su, C. Jia, Y. Qin, W. Zhou, A. Macaluso, B. Huang, and X. Wang. Sim2Real manipulation on unknown objects with tactile-based reinforcement learning. In 2024 IEEE International Conference on Robotics and Automation (ICRA), pages 9234–9241, 2024. doi:10.1109/ICRA57147.2024.10611113
- [48] T. Pang, H. J. T. Suh, L. Yang, and R. Tedrake. Global planning for contact-rich manipulation via local smoothing of quasi-dynamic contact models, 2022
- [49] M. Oller, D. Berenson, and N. Fazeli. TactileVAD: Geometric aliasing-aware dynamics for high-resolution tactile control. In Proceedings of The 7th Conference on Robot Learning, volume 229 of Proceedings of Machine Learning Research, pages 3083–3099. PMLR, 2023
- [50] Y. Zhou, W. S. Lee, Y. Gu, and Y. She. Tactile-reactive gripper with an active palm for dexterous manipulation. npj Robotics, 4(1):13, 2026
- [51] B. Huang, J. Xu, I. Akinola, W. Yang, B. Sundaralingam, R. O'Flaherty, D. Fox, X. Wang, A. Mousavian, Y.-W. Chao, and Y. Li. VT-Refine: Learning bimanual assembly with visuo-tactile feedback via simulation fine-tuning. In 9th Annual Conference on Robot Learning, 2025. URL https://openreview.net/forum?id=bOVF8Rj33i