Hi, I’m Chinmay Amrutkar
Engineering Intelligent Machines
Robotics • Computer Vision • AI
A Robotics Engineer crafting perception and control systems for autonomous agents. Currently pursuing an M.S. in Robotics and Autonomous Systems (Artificial Intelligence) at ASU, I am passionate about bridging AI theory with real-world robotic applications.
About Me
Driven by a fascination for how machines perceive and interact with the physical world, I specialize in the intersection of Computer Vision, Artificial Intelligence, and Robotics.
My academic foundation includes a B.Tech in Robotics & Automation from MIT World Peace University, where I graduated as the Gold Medalist, and ongoing M.S. studies in Robotics & Autonomous Systems (AI) at Arizona State University.
From developing perception pipelines for Unmanned Surface Vehicles using YOLOv8 to creating high-fidelity digital twins for robotic arms like the MyCobot, I enjoy building robust, efficient systems that solve tangible problems. I thrive on collaboration and the challenge of translating complex algorithms into practical, intelligent automation.
"My goal is to engineer systems that not only perceive the world intelligently but also act upon it effectively and safely."
Skills & Technologies
Featured Projects
3D Reconstruction using NeRF
Captured a 55-image 360° dataset of a real-world meteorite and performed pose estimation using Agisoft Metashape; converted outputs to NeRF-compatible formats via custom scripts and compared Instant-NGP, Nerfacto, and TensoRF on training time vs. reconstruction fidelity.
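The Metashape-to-NeRF conversion step boils down to emitting an Instant-NGP-style transforms.json. A minimal sketch of that output format (field names follow the Instant-NGP convention; the identity pose and file path below are placeholders, not values from the actual dataset):

```python
import json
import math

def make_transforms(image_poses, fov_x_deg=60.0, out_path="/tmp/transforms.json"):
    """Write a NeRF-style transforms.json from (filename, 4x4 camera-to-world) pairs."""
    frames = [
        {"file_path": name, "transform_matrix": matrix}
        for name, matrix in image_poses
    ]
    data = {
        # horizontal field of view in radians, as Instant-NGP expects
        "camera_angle_x": math.radians(fov_x_deg),
        "frames": frames,
    }
    with open(out_path, "w") as f:
        json.dump(data, f, indent=2)
    return data

# identity pose as a stand-in for a real Metashape camera transform
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
doc = make_transforms([("images/0001.png", identity)])
```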
3D Object Detection for Autonomous Driving
Compared voxel-based and BEV-based paradigms on KITTI; proposed a CBAM-FPN-ResNet18 backbone to improve detection of occluded and small-scale objects using sparse LiDAR data while maintaining real-time performance.
Context Engineering for Multimodal VLM Agents
Architected an agentic framework using Gemini 2.5 Pro to solve long-horizon planning tasks in video games. Features embedding-based loop detection (Sentence-Transformers), semantic action reformulation, and a Memento-style memory system for robust autonomous navigation.
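The loop-detection idea can be illustrated without the full VLM stack: compare each new state embedding against a sliding window of recent ones by cosine similarity and flag near-duplicates. A minimal sketch (window size and threshold are illustrative, and plain lists stand in for Sentence-Transformers vectors):

```python
from collections import deque

def cosine(a, b):
    """Cosine similarity between two non-zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

class LoopDetector:
    """Flags a loop when a new state embedding nearly duplicates a recent one."""
    def __init__(self, window=8, threshold=0.95):
        self.recent = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, embedding):
        looped = any(cosine(embedding, past) >= self.threshold
                     for past in self.recent)
        self.recent.append(embedding)
        return looped
```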
TinyML: Real-Time Sensor-Agnostic Posture Detection
Developed an end-to-end embedded ML pipeline on Arduino Nano 33 BLE. Deployed a quantized 1D-CNN using TFLite Micro to detect postures with 99.12% accuracy, robust to sensor type (Accel/Gyro/Mag) and orientation.
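Quantization here refers to TFLite's affine int8 scheme, where a real value maps to an integer via a scale and zero-point (real = (q - zero_point) * scale). A toy sketch of that mapping (the scale and zero-point values are illustrative, not the deployed model's parameters):

```python
def quantize(x, scale, zero_point):
    """Affine int8 quantization: q = round(x / scale) + zero_point, clamped."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))

def dequantize(q, scale, zero_point):
    """Inverse map back to a real value."""
    return (q - zero_point) * scale
```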
TinyML Audio Classifier for Absolutist Language Detection
Engineered a real-time keyword spotting system on Arduino Nano 33 BLE Sense using a quantized DS-CNN (93% accuracy). Implemented a full ML pipeline with TFLite Micro to detect mental health linguistic markers in continuous audio streams.
InnovationHacks 2025: Pitch Perfect
Built a Gradio-based NLP tool in 24 hours using whisper.cpp, VADER, and Ollama to transcribe interview videos, analyze sentiment and relevance, and deliver real-time feedback for job seekers.
Line-Following & Landing Drone (Simulink)
Fully onboard vision-guided control for the Parrot Mambo minidrone using Simulink and Stateflow. Includes autonomous takeoff, line following via image thresholding, and landing logic triggered by path end detection.
Maze-Solving Robot with Digital Twin
Autonomous 4x4 maze navigation using MyCobot Pro 600 and its MATLAB-based digital twin. Combines OpenCV vision, inverse kinematics, and real-time TCP control for accurate motion execution.
Cart-Pole Optimal Control (LQR)
Tuning and analysis of an LQR controller for cart-pole stability under simulated earthquake disturbances using ROS2.
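The LQR gain itself comes from solving a Riccati equation. A minimal sketch using discrete-time Riccati value iteration, illustrated on a double integrator rather than the full four-state cart-pole for brevity (dynamics and cost weights are illustrative):

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Infinite-horizon discrete LQR gain via Riccati iteration: u = -K x."""
    P = Q
    for _ in range(iters):
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])  # state: position, velocity
B = np.array([[0.0], [dt]])            # force enters through velocity
Q = np.eye(2)                          # state cost
R = np.array([[0.1]])                  # control cost
K = dlqr(A, B, Q, R)

# stable iff all closed-loop eigenvalues lie inside the unit circle
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
```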
Vision-Based Drone Landing (Parrot Mambo)
Autonomous landing on a moving platform using onboard vision (HSV segmentation, blob analysis) and Stateflow control logic in MATLAB/Simulink.
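The HSV segmentation plus blob-centroid step can be sketched with the standard library alone (OpenCV would normally do this). The hue band below targets a blue marker and is illustrative, as is the nested-list image representation:

```python
import colorsys

def segment_centroid(image, h_range=(0.55, 0.70), s_min=0.5, v_min=0.3):
    """Threshold an RGB image (nested lists, channel values in [0, 1]) in HSV
    space and return the (x, y) centroid of matching pixels, or None."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            if h_range[0] <= h <= h_range[1] and s >= s_min and v >= v_min:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

The controller would then steer to drive this centroid toward the image center before committing to descent.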
Quadcopter Flight Control System (Simulink)
Complete flight control architecture for a quadcopter in X-configuration using MATLAB Simulink. Features layered PID loops, 6DOF simulation, trajectory tracking, motor mixing, and 3D visualization.
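The motor-mixing stage is a small linear map from thrust/roll/pitch/yaw commands to four motor outputs. A sketch for an X-frame (the sign pattern follows one common convention; actual signs depend on motor spin directions and frame definition):

```python
def mix_x_config(thrust, roll, pitch, yaw):
    """Map collective thrust and roll/pitch/yaw commands to four motor outputs
    for an X-configuration quadcopter.
    Motor order: front-left, front-right, rear-left, rear-right."""
    return [
        thrust + roll + pitch - yaw,  # front-left  (CW)
        thrust - roll + pitch + yaw,  # front-right (CCW)
        thrust + roll - pitch + yaw,  # rear-left   (CCW)
        thrust - roll - pitch - yaw,  # rear-right  (CW)
    ]
```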
Robotic Trash Removal with Autonomous Surface Vessel
Designed a ROS-Gazebo simulation on the Heron USV to perform lawnmower-style surveys with constraint-aware detours for opportunistic trash interception. Validated perception-driven control with 100% waypoint recovery and deployed YOLOv8 on an ASV for real-time detection under natural lighting.
Autonomous Drone for Geological Mapping and Landing
Fully autonomous RGBD drone mission using ROS 2 + PX4 for geological search and precision landing. Combines ArUco-based pose estimation, boustrophedon coverage, and descent logic to identify and land on the tallest cylinder.
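The boustrophedon coverage used here reduces to sweeping parallel lanes in alternating directions over the search area. A minimal waypoint-generation sketch (rectangle bounds and lane spacing are illustrative):

```python
def boustrophedon(x_min, x_max, y_min, y_max, lane_spacing):
    """Generate back-and-forth (lawnmower) waypoints covering a rectangle."""
    waypoints = []
    y = y_min
    forward = True
    while y <= y_max + 1e-9:
        # sweep the lane left-to-right, then right-to-left on the next lane
        xs = (x_min, x_max) if forward else (x_max, x_min)
        waypoints.append((xs[0], y))
        waypoints.append((xs[1], y))
        forward = not forward
        y += lane_spacing
    return waypoints
```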
AI Scheduling Assistant
A full-stack conversational AI for Google Calendar. Parses natural language commands using the Gemini API and Google Cloud Functions to manage events, showcasing secure, end-to-end cloud application development.
Self-Taught Reasoning (STaR) for Math Problems
Implementation of the STaR paper on the GSM8k dataset. The Llama 3 model bootstraps its own reasoning ability by learning from its self-generated correct answers, outperforming baseline fine-tuning.
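The core STaR loop is simple: sample a rationale and answer, keep the example only if the answer matches the reference, and fine-tune on the kept set. A minimal sketch with the model and fine-tuning step stubbed out as callables (names are illustrative):

```python
def star_iteration(model_generate, finetune, problems, answers):
    """One STaR bootstrap round: keep only self-generated rationales whose
    final answer is correct, then fine-tune on those examples."""
    keep = []
    for problem, gold in zip(problems, answers):
        rationale, predicted = model_generate(problem)
        if predicted == gold:        # filter on answer correctness
            keep.append((problem, rationale))
    finetune(keep)                   # model learns from its own correct reasoning
    return keep
```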
Publications & Achievements
Publications
Towards Robotic Trash Removal with Autonomous Surface Vessels
Robots in the Wild Workshop, IEEE ICRA 2025 - Accepted & Presented
A state-of-the-art review on robotics in waste sorting...
Intl. Journal on Interactive Design & Manufacturing (IJIDeM), 2023
Overview of Autonomous Vehicles and Its Challenges
Techno-Societal 2022 (ICATSA 2022), Springer, Cham
Awards & Achievements
Academic Honors & Scholarships
- ASU New American University Scholarship ($10,000)
- ASU Engineering Fellowship ($1,000)
- Graduated as Gold Medalist (B.Tech)
- 3x Merit Scholarship Holder ($3,750)
Featured by Elephant Robotics Company on Hackster.io
My project "Maze Navigation Use Case Based on Robot Vision System" was recognized and published by Elephant Robotics on Hackster.io as a featured innovation in robotic vision.
InnovationHacks 2025: Pitch Perfect
- 🥇 Best Use of MATLAB by MathWorks: Used MATLAB for facial verification comparing profile images with live video frames.
- 🥈 2nd Prize in Qruil: Job Board Features: Built a smart interview system analyzing sentiment, relevance, and behavior.
Resume
Download my resume for a comprehensive overview of my qualifications, experience, and skills.
Let's Collaborate
I love collaborating on cutting-edge robotics and computer vision systems. Whether you're working on autonomous systems, digital twins, or AI-driven robotics, I’d love to connect. Let’s build something intelligent together.