Incubator Speech
Incubator Presentation Summary⌗
Good morning, everyone.⌗
I’m Aditya Rawat, and this is Jensilin. We are here to present a summary of our incubator project and share what we’ve learned from it.
Name of Project⌗
Our project is called Wheeled and Arm Robotic Simulation.
Project Overview⌗
Before giving you the project overview, let’s look at the problem statement to understand the project better.
Suppose we have to send a robot rover to the Moon’s surface to collect samples. Without knowing anything about the terrain, how can we train the rover for sample collection?
That’s where our project comes into the picture.
Our project focuses on training simulation models using NVIDIA Omniverse, specifically the Isaac Sim platform. The goal is to bridge the gap between simulation and the real world (sim-to-real) by training models under different conditions.
We’ve developed and tested two key use cases:
- Drift Simulation
- SO-100 Arm Pick
🌀 Our First Use case – Drift Simulation⌗
The objective of this use case was to simulate realistic drifting behavior of a wheeled robot under different surface and steering conditions.
🧠 To explain this in simple terms:⌗
Imagine driving a car to Ladakh. Along the way you might be driving on various surfaces like sand, ice, or concrete.
Each surface affects how the car turns or skids.
Similarly, our simulation replicates how a robot might behave under such varying conditions, helping us understand and train it better.
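To make the surface effect concrete, here is a minimal sketch of how friction limits cornering. The friction coefficients below are illustrative assumptions, not measured values from our simulation: the tires can hold a turn only while the friction force covers the centripetal force, i.e. mu * m * g >= m * v^2 / r.

```python
import math

# Hypothetical friction coefficients for the surfaces mentioned above;
# real values depend on tires, temperature, and surface condition.
SURFACE_FRICTION = {"concrete": 0.9, "sand": 0.6, "ice": 0.1}

def max_corner_speed(surface: str, radius_m: float, g: float = 9.81) -> float:
    """Highest speed (m/s) at which the tires can still hold a circular
    turn of the given radius: solve mu * g = v^2 / r for v."""
    mu = SURFACE_FRICTION[surface]
    return math.sqrt(mu * g * radius_m)

for surface in SURFACE_FRICTION:
    v = max_corner_speed(surface, radius_m=20.0)
    print(f"{surface:8s}: {v:4.1f} m/s")
```

The same 20 m turn that concrete handles comfortably forces a skid (drift) on ice at a third of the speed, which is exactly the behavior the simulation varies per surface.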
This simulation was built using the Wheeled Lab package available inside Isaac Sim.
Important Note:
The primary goal of the incubator was not 3D model development.
Instead, it was to generate synthetic data by training models to perform specific actions across different environments.
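One common way to get synthetic data across different environments is domain randomization: each training episode samples new environment parameters so the model does not overfit a single terrain. The sketch below is a toy illustration of that idea; the parameter names and ranges are assumptions for the example, not the values used in our incubator.

```python
import random

def sample_environment(seed=None):
    """Sample one randomized environment configuration for an episode.
    Ranges here are illustrative, not our actual training settings."""
    rng = random.Random(seed)
    return {
        "surface": rng.choice(["sand", "ice", "concrete"]),
        "friction": rng.uniform(0.1, 1.0),   # surface friction coefficient
        "slope_deg": rng.uniform(0.0, 15.0), # terrain incline
    }

# Generate a few randomized environments for successive episodes.
for i in range(3):
    print(sample_environment(seed=i))
```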
🔧 Technical Details:⌗
- 3D models of the wheeled robots were imported in `.urdf` format.
- The environment (scene) was imported into the simulation in `.usd` format.
🤖 The Second use case – SO-100 Arm⌗
The second use case focuses on simulating the SO-100 robotic arm performing a basic pick-up task in a virtual, physics-enabled simulation.
🎯 Objective:⌗
To have the robot identify a cube and articulate its joints to pick it up using its grasping logic, simulating how it would perform in a real-world industrial setting.
🔑 Features:⌗
- Physics-enabled articulated robotic arm.
- Collision-aware contact sensors.
- Grasping implemented via joint control and rigid body simulation.
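The "grasping via joint control" feature above can be sketched as a simple proportional controller that drives the gripper's finger joints toward a closed target position. The gains, limits, and time step here are illustrative assumptions; in the actual simulation the joint targets are resolved by the physics engine, not by this loop.

```python
def step_gripper(positions, targets, kp=5.0, dt=0.02, max_step=0.05):
    """Advance each finger joint one control step toward its target.

    kp       : proportional gain (assumed value)
    dt       : control time step in seconds (assumed value)
    max_step : per-step joint travel limit, acting as a velocity clamp
    """
    new_positions = []
    for q, q_target in zip(positions, targets):
        delta = kp * (q_target - q) * dt              # proportional command
        delta = max(-max_step, min(max_step, delta))  # clamp joint speed
        new_positions.append(q + delta)
    return new_positions

# Close both fingers from fully open (0.0) toward the grasp width (0.4 rad).
q = [0.0, 0.0]
targets = [0.4, 0.4]
for _ in range(200):
    q = step_gripper(q, targets)
print([round(v, 3) for v in q])
```

In the physics-enabled scene, the contact sensors listed above would stop the closing motion once the fingers touch the cube, instead of running a fixed number of steps.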
Questions⌗
Isaac lab?⌗
It is built specifically for robot learning.
Isaac sim?⌗
Robot learning can be done in it, but it is also used for much more than robot learning.
How is our incubator different from a physical robot pick-up arm?⌗
- Our project is built entirely in simulation.
- Lower cost.
- More efficient development cycle.
- Multiple virtual environments.
How did you implement RL in your incubator?⌗
Reinforcement learning support is built into the Isaac ecosystem: Isaac Lab, which runs on top of Isaac Sim, provides the RL training workflows.
Where are we using Python scripts in our incubator?⌗
Python scripts are used to control robots, add sensors, move objects, and configure the simulated environments.
Why are we using PhysX?⌗
PhysX is NVIDIA’s physics engine. Isaac Sim uses it, with GPU acceleration, to simulate rigid bodies, contacts, and articulations.
Why does the car need to drift?⌗
We simulate drifting to study complex vehicle dynamics such as traction loss and yaw control, and to see how vehicles behave under different surface and steering conditions.
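The traction loss mentioned in the answer above can be sketched with a toy tire model: lateral force grows roughly linearly with slip angle until it saturates at the friction limit, and beyond that point the tire slides (the car drifts). All numbers below are illustrative assumptions, not parameters from our simulation.

```python
import math

def lateral_force(slip_angle_rad, cornering_stiffness, mu, normal_load):
    """Linear tire model with a friction cap: lateral force is
    stiffness * slip angle, clamped to +/- mu * Fz (traction limit)."""
    linear = cornering_stiffness * slip_angle_rad
    limit = mu * normal_load
    return max(-limit, min(limit, linear))

# Illustrative numbers: 50 kN/rad cornering stiffness, 4 kN load per tire.
for mu, surface in [(0.9, "concrete"), (0.1, "ice")]:
    fy = lateral_force(math.radians(8), 50_000.0, mu, 4_000.0)
    print(f"{surface}: {fy:.0f} N")
```

At the same 8-degree slip angle both tires are saturated, but the ice tire produces a fraction of the concrete tire's grip, which is why the same steering input that corners cleanly on concrete spins the car into a drift on ice.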