{"id":192852,"date":"2025-05-05T10:04:23","date_gmt":"2025-05-05T14:04:23","guid":{"rendered":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/?p=192852"},"modified":"2026-05-04T10:08:18","modified_gmt":"2026-05-04T14:08:18","slug":"ml-driven-wirelessly-controlled-robot","status":"publish","type":"post","link":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/","title":{"rendered":"ML-Driven Wirelessly Controlled Robot"},"content":{"rendered":"\n<h3 class=\"wp-block-heading\">Team Members<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Ethan Lin<\/li>\n\n\n\n<li>Matthew Lucht<\/li>\n\n\n\n<li>Leon Peng<\/li>\n\n\n\n<li>Ekko Wu<\/li>\n\n\n\n<li>Andrew Yuan<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\">Advisors<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Irving Barron<\/li>\n\n\n\n<li>Jack Motley<\/li>\n\n\n\n<li>Daniel Phinney<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"714\" height=\"1024\" src=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-714x1024.png\" alt=\"\" class=\"wp-image-211942\" srcset=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-714x1024.png 714w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-209x300.png 209w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-768x1101.png 768w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-1071x1536.png 1071w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-1428x2048.png 1428w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-1920x2754.png 1920w\" sizes=\"auto, (max-width: 714px) 100vw, 714px\" \/><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Abstract<\/h2>\n\n\n\n<p>In this 
project, we have designed an \u201cML-Driven Wirelessly Controlled Robot\u201d that uses machine learning to detect objects of a given class and automatically navigate towards the nearest object of that class found within its camera view. The robot does not use any external sensors, detectors, or modules to perform this task, and is able to detect 80 different classes of objects. The robot can be controlled via three modes: Physical Joystick, ML Object Detection, and Web\u2011Based Virtual Joystick. The project is intended as a starting point for designing a robot that could find or travel to objects on campus, for example to transport items to specific locations or to track down lost items without human effort.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Design Pipeline<\/h2>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"549\" src=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/pipelinepicture-1024x549.png\" alt=\"\" class=\"wp-image-210242\" srcset=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/pipelinepicture-1024x549.png 1024w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/pipelinepicture-300x161.png 300w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/pipelinepicture-768x412.png 768w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/pipelinepicture.png 1533w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Hardware Components<\/h3>\n\n\n\n<ul class=\"wp-block-list\">\n<li>\u201cRhino\u201d Robot<\/li>\n\n\n\n<li>Nvidia Jetson Nano<\/li>\n\n\n\n<li>HERO Control Board<\/li>\n\n\n\n<li>UART Pin Connector<\/li>\n\n\n\n<li>Laptop<\/li>\n\n\n\n<li>Logitech Camera<\/li>\n<\/ul>\n\n\n\n<p>At a high level, 
the computer uses machine learning to detect objects of interest within the view of the camera, which is mounted on the robot. The object with the highest confidence rating is selected, and the speed and angle at which the robot must move to reach the object of interest are calculated. This information is then published to the Nano via UDP, which sends the required speed and angle to the HERO control board as a string through UART. The HERO board is configured to receive this string and move the wheels of the robot accordingly. This process then repeats, with the detection program incrementally adjusting the speed and angle and sending the updated values through the pipeline. Additionally, user commands from the web-based virtual joystick are received by a Flask server on the computer and forwarded to the Nano via the same UDP\/UART pipeline, providing remote-control functionality.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Software<\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Jetson Nano<\/h3>\n\n\n\n<p>The Nano is used as a bridge that allows the machine learning algorithm to communicate easily with the HERO. The Nano wirelessly receives the speed and angle values sent by the machine learning program on the computer, and publishes those inputs to the HERO Board on the robot using UART. 
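<\/p>\n\n\n\n<p>For illustration, the laptop-side sender might look like the following sketch; the Nano\u2019s IP address and the exact command-string format here are placeholders, not necessarily what our code uses:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>import socket\n\n# Placeholder address for the Nano; port 8888 matches the receiver below\nJETSON_ADDR = ('192.168.1.42', 8888)\n\nsender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)\n\ndef format_command(v, w):\n    # Hypothetical 'v,w' string format; must match what the HERO-side parser expects\n    return '{:.2f},{:.2f}'.format(v, w)\n\ndef send_command(v, w):\n    # Fire-and-forget UDP datagram; the Nano relays it to the HERO over UART\n    sender.sendto(format_command(v, w).encode('utf-8'), JETSON_ADDR)<\/code><\/pre>\n\n\n\n<p>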
A segment of the UART program is shown below:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>import socket\nimport serial\n\n# Open the Nano's UART port to the HERO board\nuart = serial.Serial('\/dev\/ttyTHS1', baudrate=115200)\n\n# Listen for UDP packets from the laptop\nsock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)\nsock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)\nsock.bind(('0.0.0.0', 8888))\n\nprint(\"Jetson is listening on port 8888\")\n\nwhile True:\n    data, addr = sock.recvfrom(1024)\n    command = data.decode('utf-8').strip()\n    print(command)\n    # Relay the command to the HERO, newline-terminated\n    uart.write((command + '\\n').encode('utf-8'))<\/code><\/pre>\n\n\n\n<p>The Nano is explicitly designed to run machine learning programs and can interface directly with our camera; in principle, the machine learning should be able to run locally on the Nano without a connected laptop. In testing, however, the program did not run fast enough on the Nano alone, so it proved more efficient to run the model on a computer, though further optimization might eventually allow the Nano to run it at an acceptable speed.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">HERO Board<\/h3>\n\n\n\n<p>The HERO Board is embedded in the robot and communicates directly with the robot\u2019s motors. It is programmed in C# to read speed and angle inputs and rotate the motors in accordance with these parameters. Speed and angular velocity are read by the program as \u201cv\u201d and \u201cw\u201d, respectively; the motors they control are denoted \u201ctalons\u201d. 
A snippet of the code which converts v and w is shown below:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>\/\/ Mix linear (v) and angular (w) velocity into left and right throttle\nstatic void Drive(float v, float w, bool isUART)\n{\n    if (isUART)\n        v = -v;\n\n    float leftThrot = v + w;\n    float rightThrot = v - w;\n\n    leftThrot = Clamp(leftThrot, -1.0f, 1.0f);\n    rightThrot = Clamp(rightThrot, -1.0f, 1.0f);\n\n    left.Set(ControlMode.PercentOutput, leftThrot);\n    leftSlave.Set(ControlMode.PercentOutput, leftThrot);\n    right.Set(ControlMode.PercentOutput, -rightThrot);\n    rightSlave.Set(ControlMode.PercentOutput, -rightThrot);\n}<\/code><\/pre>\n\n\n\n<p>Speed and angular velocity may be supplied by any of the three control modes: the Physical Joystick, ML Object Detection, or the Web-Based Virtual Joystick. In ML mode, these inputs are derived from the camera feed and sent automatically through the Nano, as previously described.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Machine Learning Model<\/h3>\n\n\n\n<p>We deploy YOLOv8, a computer vision model from Ultralytics. Pre-trained on COCO-2017, YOLOv8 can detect 80 classes. Ultralytics offers five YOLOv8 variants of increasing parameter count; we use the medium version, YOLOv8m, which has 25M parameters. A post-processing stage is combined with YOLOv8m to perform the computer vision work and send v and w to the controls. Briefly, the pipeline is: video from the camera is processed by the YOLOv8m model; detected objects are grouped into the target and obstacles; v and w are generated based on where the target and the obstacles appear in the frame; and v and w are sent as a string to the controls. The post-processing uses a state machine with four states: Seek, Backup, Avoid, and Find. 
Seek is the state in which the robot sees the target and tries to approach it; Backup is the state in which the robot finds an obstacle too close and moves backward to keep a safe distance; Avoid is the state in which the robot turns left and right, trying to pass the obstacle; Find is the state in which the robot has lost the target and tries to reacquire it based on where it disappeared. All four states are triggered under certain conditions and hand off smoothly to one another so that the robot can approach its target.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Joystick Web Controller<\/h2>\n\n\n\n<p>In addition to the machine learning model, the robot may also be controlled remotely from an external website. The web joystick runs in a web browser and connects to a Flask server on the control computer to provide near-real\u2011time robot control. The page shows two virtual joysticks: one that moves up and down to set the robot\u2019s linear velocity (v) and another that moves left and right to set its angular velocity (w). Every 50\u202fms, the browser reads both stick positions and calculates v and w, automatically flipping w when you drive backward so turning feels natural. The current v and w values are displayed in the center of the interface and sent via HTTP to the Flask server, which immediately wraps them into a UDP packet and forwards it to the Jetson Nano. When you release a stick, it smoothly drifts back to center and resets its value to zero, bringing the robot to a stop. To suit different driving styles, we implemented four speed modes: low (|v| \u2264 0.3), medium (|v| \u2264 0.6), high (|v| \u2264 0.8), and turbo (|v| \u2264 1.0), all selectable in the UI. 
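<\/p>\n\n\n\n<p>The speed modes amount to a simple clamp on v before the command is sent; a Python sketch of the idea (the function name is illustrative, the limits are the values above):<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>SPEED_LIMITS = {'low': 0.3, 'medium': 0.6, 'high': 0.8, 'turbo': 1.0}\n\ndef apply_speed_mode(v, mode):\n    # Clamp v into [-limit, +limit] for the selected mode\n    limit = SPEED_LIMITS[mode]\n    return max(-limit, min(limit, v))<\/code><\/pre>\n\n\n\n<p>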
We tested the system over the university\u2019s campus Wi\u2011Fi and confirmed stable, responsive control anywhere within coverage.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"357\" src=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/controller-1024x357.png\" alt=\"\" class=\"wp-image-210532\" srcset=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/controller-1024x357.png 1024w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/controller-300x104.png 300w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/controller-768x267.png 768w, https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/04\/controller.png 1192w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Results<\/h2>\n\n\n\n<p>The program runs smoothly, with the robot able to successfully navigate towards any of the objects identifiable by the machine learning algorithm. 
Below are video examples of the robot navigating towards a person:<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"960\" style=\"aspect-ratio: 540 \/ 960;\" width=\"540\" controls src=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/IMG_8271-1.mp4\"><\/video><\/figure>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"960\" style=\"aspect-ratio: 540 \/ 960;\" width=\"540\" controls src=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/IMG_8271.mp4\"><\/video><\/figure>\n\n\n\n<p>If multiple objects are detected within the camera view, the robot selects and moves towards the object with the highest confidence, i.e., the detection most likely to actually belong to the class the model is looking for.<\/p>\n\n\n\n<p>The obstacle avoidance program navigates the robot around obstacles between it and an object by detecting them with the camera, backing the robot up, and turning around them. It works by the machine learning program reactively switching between &#8220;SEEK&#8221; mode, where the robot drives toward the selected object; &#8220;BACKUP&#8221; mode, where it turns away from detected obstacles; and &#8220;FIND&#8221; mode, where it turns the camera and searches for the original target object. Once the object is found, the program switches back into &#8220;SEEK&#8221; mode and resumes driving toward it. 
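<\/p>\n\n\n\n<p>The mode switching can be sketched as a small state machine; the state names come from our program, but the transition conditions below are illustrative stand-ins for the actual thresholds:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>def next_state(state, target_visible, obstacle_close):\n    # Obstacles take priority: back off until a safe distance is restored\n    if obstacle_close and state != 'BACKUP':\n        return 'BACKUP'\n    if state == 'BACKUP':\n        return 'BACKUP' if obstacle_close else 'AVOID'\n    # In AVOID, turn until the target is seen again\n    if state == 'AVOID':\n        return 'SEEK' if target_visible else 'AVOID'\n    # Otherwise seek the target, or search where it disappeared\n    return 'SEEK' if target_visible else 'FIND'<\/code><\/pre>\n\n\n\n<p>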
Below are videos of the robot successfully navigating around simulated obstacles while moving toward a chair:<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"1280\" style=\"aspect-ratio: 720 \/ 1280;\" width=\"720\" controls src=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Obstacles.mp4\"><\/video><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"wp-embed\"><div class=\"wp-embed-wrap\"><iframe loading=\"lazy\" title=\"Obstacle Avoidance\" width=\"1062\" height=\"597\" src=\"https:\/\/www.youtube.com\/embed\/dQPN_8HrTfE?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/div><\/div>\n<\/div><\/figure>\n\n\n\n<p>The web joystick UI also works, with each speed button successfully setting the robot\u2019s speed limit. A video of the robot being controlled via the joystick website is shown below:<\/p>\n\n\n\n<figure class=\"wp-block-video\"><video height=\"720\" style=\"aspect-ratio: 1280 \/ 720;\" width=\"1280\" controls src=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/joystickvideo.mp4\" playsinline><\/video><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\">Further Work<\/h2>\n\n\n\n<p>As described previously, the machine learning model should be able to run on the Nano alone; in testing, this was simply too slow. More work could be done to optimize the program so that the Nano performs fewer computations per iteration and can run it at an acceptable speed. 
The machine learning model may be tuned to increase the number of detectable object classes, or to prefer certain objects of the same class over others when multiple are detected at once, e.g., choosing one chair over another based on characteristics specified by a user. Given more time, an option to activate the machine learning \u201cmode\u201d from the joystick website could also be added. Lastly, the same pipeline could potentially be used for any device on which a camera can be mounted, as long as the board controlling the device\u2019s movement is configured correctly. With some adjustments, a drone or a different robot could perform the same navigation, which could be an interesting project for future teams.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Archive of Work<\/h2>\n\n\n\n<p>Below is a link to our GitHub repo, where all the code we wrote for this project is archived: <a href=\"https:\/\/github.com\/JiajunWu107\/TrackRobot\/tree\/main\">https:\/\/github.com\/JiajunWu107\/TrackRobot\/tree\/main<\/a><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Acknowledgements<\/h2>\n\n\n\n<p>We would like to thank Professor Mottley, Professor Barron, and Professor Phinney for their guidance and technical assistance. We would also like to thank Tabib Wasit Rahman for his expertise in programming the robot and his great help in developing our project pipeline.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>We designed an ML-Driven Wirelessly Controlled Robot which uses machine learning to detect objects of a given class and automatically navigate towards the closest one within its camera view. 
The robot is able to detect over 80 different classes of objects, and can also be controlled via three modes: Physical Joystick, ML Object Detection, and Web\u2011Based Virtual Joystick.<\/p>\n","protected":false},"author":16772,"featured_media":211942,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_coblocks_attr":"","_coblocks_dimensions":"","_coblocks_responsive_height":"","_coblocks_accordion_ie_support":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[4442,116,3006,8772],"tags":[],"coauthors":[21402,21412,21422,21432,21302],"class_list":["post-192852","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-archive","category-ece-archive","category-machine-learning-archive","category-robotics-archive"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>ML-Driven Wirelessly Controlled Robot - Senior Design Day<\/title>\n<meta name=\"description\" content=\"In this project, we have designed an \u201cML-Driven Wireless Control Robot\u201d that uses machine learning to detect objects of a given class and automatically navigate towards the nearest object of that class found within its camera view.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"ML-Driven Wirelessly Controlled Robot - Senior Design Day\" \/>\n<meta property=\"og:description\" content=\"In this project, we have designed an 
\u201cML-Driven Wireless Control Robot\u201d that uses machine learning to detect objects of a given class and automatically navigate towards the nearest object of that class found within its camera view.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/\" \/>\n<meta property=\"og:site_name\" content=\"Senior Design Day\" \/>\n<meta property=\"article:published_time\" content=\"2025-05-05T14:04:23+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-05-04T14:08:18+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-1200x630.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"630\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Ethan Lin, Matthew Lucht, Leon Peng, Jiajun Wu, Ziyang Yuan\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Ethan Lin, Matthew Lucht, Leon Peng, Jiajun Wu, Ziyang Yuan\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/\"},\"author\":{\"name\":\"Ethan Lin\",\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/#\\\/schema\\\/person\\\/d63dc18166be1cc104aea60553dd45b4\"},\"headline\":\"ML-Driven Wirelessly Controlled Robot\",\"datePublished\":\"2025-05-05T14:04:23+00:00\",\"dateModified\":\"2026-05-04T14:08:18+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/\"},\"wordCount\":1505,\"image\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/wp-content\\\/uploads\\\/2025\\\/05\\\/Robot1.png\",\"articleSection\":[\"3. 
Programs Archive\",\"ECE Archive\",\"Machine Learning Archive\",\"Robotics Archive\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/\",\"url\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/\",\"name\":\"ML-Driven Wirelessly Controlled Robot - Senior Design Day\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/wp-content\\\/uploads\\\/2025\\\/05\\\/Robot1.png\",\"datePublished\":\"2025-05-05T14:04:23+00:00\",\"dateModified\":\"2026-05-04T14:08:18+00:00\",\"author\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/#\\\/schema\\\/person\\\/d63dc18166be1cc104aea60553dd45b4\"},\"description\":\"In this project, we have designed an \u201cML-Driven Wireless Control Robot\u201d that uses machine learning to detect objects of a given class and automatically navigate towards the nearest object of that class found within its camera 
view.\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/#primaryimage\",\"url\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/wp-content\\\/uploads\\\/2025\\\/05\\\/Robot1.png\",\"contentUrl\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/wp-content\\\/uploads\\\/2025\\\/05\\\/Robot1.png\",\"width\":3692,\"height\":5295},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/ml-driven-wirelessly-controlled-robot\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"ML-Driven Wirelessly Controlled Robot\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/#website\",\"url\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/\",\"name\":\"Senior Design Day\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/#\\\/schema\\\/person\\\/d63dc18166be1cc104aea60553dd45b4\",\"name\":\"Ethan 
Lin\",\"url\":\"https:\\\/\\\/www.hajim.rochester.edu\\\/senior-design-day\\\/author\\\/elin17\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"ML-Driven Wirelessly Controlled Robot - Senior Design Day","description":"In this project, we have designed an \u201cML-Driven Wireless Control Robot\u201d that uses machine learning to detect objects of a given class and automatically navigate towards the nearest object of that class found within its camera view.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/","og_locale":"en_US","og_type":"article","og_title":"ML-Driven Wirelessly Controlled Robot - Senior Design Day","og_description":"In this project, we have designed an \u201cML-Driven Wireless Control Robot\u201d that uses machine learning to detect objects of a given class and automatically navigate towards the nearest object of that class found within its camera view.","og_url":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/","og_site_name":"Senior Design Day","article_published_time":"2025-05-05T14:04:23+00:00","article_modified_time":"2026-05-04T14:08:18+00:00","og_image":[{"width":1200,"height":630,"url":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1-1200x630.png","type":"image\/png"}],"author":"Ethan Lin, Matthew Lucht, Leon Peng, Jiajun Wu, Ziyang Yuan","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Ethan Lin, Matthew Lucht, Leon Peng, Jiajun Wu, Ziyang Yuan","Est. 
reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/#article","isPartOf":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/"},"author":{"name":"Ethan Lin","@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/#\/schema\/person\/d63dc18166be1cc104aea60553dd45b4"},"headline":"ML-Driven Wirelessly Controlled Robot","datePublished":"2025-05-05T14:04:23+00:00","dateModified":"2026-05-04T14:08:18+00:00","mainEntityOfPage":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/"},"wordCount":1505,"image":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1.png","articleSection":["3. 
Programs Archive","ECE Archive","Machine Learning Archive","Robotics Archive"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/","url":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/","name":"ML-Driven Wirelessly Controlled Robot - Senior Design Day","isPartOf":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/#primaryimage"},"image":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1.png","datePublished":"2025-05-05T14:04:23+00:00","dateModified":"2026-05-04T14:08:18+00:00","author":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/#\/schema\/person\/d63dc18166be1cc104aea60553dd45b4"},"description":"In this project, we have designed an \u201cML-Driven Wireless Control Robot\u201d that uses machine learning to detect objects of a given class and automatically navigate towards the nearest object of that class found within its camera 
view.","breadcrumb":{"@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/#primaryimage","url":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1.png","contentUrl":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-content\/uploads\/2025\/05\/Robot1.png","width":3692,"height":5295},{"@type":"BreadcrumbList","@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/ml-driven-wirelessly-controlled-robot\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/"},{"@type":"ListItem","position":2,"name":"ML-Driven Wirelessly Controlled Robot"}]},{"@type":"WebSite","@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/#website","url":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/","name":"Senior Design Day","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/#\/schema\/person\/d63dc18166be1cc104aea60553dd45b4","name":"Ethan 
Lin","url":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/author\/elin17\/"}]}},"_links":{"self":[{"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/posts\/192852","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/users\/16772"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/comments?post=192852"}],"version-history":[{"count":32,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/posts\/192852\/revisions"}],"predecessor-version":[{"id":213472,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/posts\/192852\/revisions\/213472"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/media\/211942"}],"wp:attachment":[{"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/media?parent=192852"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/categories?post=192852"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/tags?post=192852"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.hajim.rochester.edu\/senior-design-day\/wp-json\/wp\/v2\/coauthors?post=192852"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}