Research during Ph.D.
My research focuses on embodied AI, robotic in-hand manipulation, foundation models, visuotactile representations, control theory, and multi-modal robotic systems.
* indicates equal contribution.
ManiFeel: Benchmarking and Understanding Visuotactile Manipulation Policy Learning
Quan Khanh Luu*, Pokuang Zhou*, Zhengtong Xu*, Zhiyuan Zhang, Qiang Qiu, Yu She
arXiv, 2025; New England Manipulation Symposium   (Oral)
project page / arXiv
An open-source platform for vision- and tactile-based robotic task execution in both Isaac Gym and the real world, enabling systematic benchmarking of state-of-the-art policy learning methods.
In-Hand Singulation, Scooping, and Cable Untangling with a 5-DoF Tactile-Reactive Gripper
Yuhao Zhou*, Pokuang Zhou*, Shaoxiong Wang, Yu She
ADRR, 2025   (Cover Feature)
[Journal] Advanced Robotics Research
project page / paper
We developed a custom 5-DoF gripper, co-designed across hardware and control algorithms, that enables in-hand singulation, scooping, and cable untangling.
Safe Human-Robot Collaboration With Risk Tunable Control Barrier Functions
Vipul K Sharma*, Pokuang Zhou*, Zhengtong Xu*, Yu She, S Sivaranjani
T-MECH and AIM, 2025   (Best Student Paper Nomination)
[Journal] IEEE/ASME Transactions on Mechatronics and [Conference] IEEE/ASME International Conference on Advanced Intelligent Mechatronics
video / paper
A control-barrier-function-based framework for safe human-robot collaboration that enables adjustable risk modulation to trade off safety against task efficiency.
DogTac: Visuotactile-Based Whole-Body Control of a Quadrupedal Robot with Manipulator
Pokuang Zhou, Joseph Campbell, Yu She
under review (journal), 2025
Developed a reinforcement-learning whole-body controller with visuotactile fusion for a quadruped-arm system, enabling diverse loco-manipulation tasks; trained in NVIDIA Isaac Gym and deployed on real hardware via sim-to-real transfer.
DartBot: Overhand Throwing of Deformable Objects with Tactile Sensing and Reinforcement Learning
Shoaib Aslam*, Krish Kumar*, Pokuang Zhou*, Hongyu Yu, Michael Wang, Yu She
T-ASE and CASE, 2025
[Journal] IEEE Transactions on Automation Science and Engineering and [Conference] IEEE International Conference on Automation Science and Engineering
video / paper
Designed a vision- and tactile-based reinforcement learning system that enables a UR5 robot to throw deformable darts overhand and accurately hit the target.
TacScope: A Miniaturized Vision-based Tactile Sensor for Surgical Applications
Md Rakibul Islam Prince, Sheeraz Athar, Pokuang Zhou, Yu She
ADRR, 2025
[Journal] Advanced Robotics Research, accepted (adrr.202500117r1)
paper (coming soon)
TacScope is a compact, low-cost vision-based tactile sensor with a spherical elastomer that, after single-image calibration, captures high-resolution 3D contact geometry.
Stick Roller: Precise In-hand Stick Rolling with a Sample-Efficient Tactile Model
Yipai Du, Pokuang Zhou, Michael Yu Wang, Wenzhao Lian, Yu She
IROS, 2024
[Conference] IEEE/RSJ International Conference on Intelligent Robots and Systems
video / paper
Stick Roller achieves precise in-hand stick repositioning through a sample-efficient tactile model, rolling the stick to the center of the fingers using only a few well-planned two-finger manipulations.
In-Hand Singulation and Scooping Manipulation with a 5 DOF Tactile Gripper
Yuhao Zhou*, Pokuang Zhou*, Shaoxiong Wang, Yu She
IROS, 2024
[Conference] IEEE/RSJ International Conference on Intelligent Robots and Systems
project page / paper
A compact gripper with integrated GelSight tactile sensing that simplifies dexterous manipulation, demonstrating high success on granular singulation/classification and precise credit-card scooping/insertion tasks.
Robotic System with Tactile-Enabled High-Resolution Hyperspectral Imaging Device for Autonomous Corn Leaf Phenotyping in Controlled Environments
Xuan Li, Ziling Chen, Raghava Sai Uppuluri, Pokuang Zhou, Tianzhang Zhao, Darrell Zachary Good, Yu She, Jian Jin
ATECH, 2025
[Journal] Smart Agricultural Technology
paper
An autonomous robotic system that uses SAM-guided RGB-D perception, in-hand leaf manipulation, and a line-scan hyperspectral camera with a vision-based tactile tracker to scan individual corn leaves at high resolution.
Feel free to steal this website's source code. Do not scrape the HTML from this page itself, as it includes analytics tags that you do not want on your own website; use the GitHub code instead. Also consider using Leonid Keselman's Jekyll fork of this page.