Mechanical Engineering • Robotics • Aerospace Systems

Marcus Greenan

UC San Diego • Mechanical Engineering • Robotics and Controls

Mechanical engineering student building robotics hardware, targeting robotics, autonomy, and hardware internships.

I work on systems that have to run on real hardware: CAD, brackets, wiring, sensing, ROS2 interfaces, and the tuning work that gets a robot from partial subsystems to repeatable behavior.

Most of my work is in robotics, controls, fabrication, and systems integration, especially when packaging, bandwidth, power, and field testing all matter at once.

Engineering Snapshot
Primary focus
Robotics, controls, and hardware integration
Applied areas
Mobile robotics, machine shop work, manipulator packaging, and test-driven iteration
Target roles
Robotics / hardware internships across autonomy, aerospace, defense, and advanced manufacturing
Target roles
Robotics / Hardware Internships
Systems stack
ROS2 + CAD + Fabrication
Education
UC San Diego, Mechanical Engineering, GPA 3.57, Provost Honors
Graduation timeline
Expected June 2027
Profile

I build systems and keep working on them until the mechanical layout, sensing, and control behavior make sense on real hardware.

I am a mechanical engineering student at UC San Diego specializing in robotics and controls. I work best on projects where the mechanical, software, and electrical pieces all affect each other and someone has to make the whole thing behave.


UC San Diego | Robotics and Controls

Marcus Greenan cutting sheet metal in the machine shop
Machine shop work is part of the portfolio, not a side note. I use fabrication experience to make packaging, fixturing, and tolerance decisions more realistic.

That usually means moving between CAD, printed or machined parts, sensor placement, ROS2 interfaces, and repeated bench or field tests. I care about whether the package is serviceable, whether the sensing geometry matches the control logic, and whether the system still works once power and bandwidth become real constraints.

I am targeting robotics and hardware internships where hands-on integration matters: mobile robotics, autonomy, aerospace hardware, defense systems, and advanced manufacturing.

Education

University of California, San Diego

B.S. Mechanical Engineering

Specialization in Robotics and Controls

Provost Honors

Expected Graduation: June 2027

Controls • Dynamics • Solid Mechanics • Fluid Mechanics • Thermodynamics • Materials Science
Robotics Integration

From sensors to actuator behavior

I work at the point where camera geometry, LiDAR processing, ROS2 messaging, and physical packaging directly affect how the robot moves.

Mechanical Execution

CAD, hardware packaging, and fabrication

I design brackets, mounts, and integrated assemblies with service access, stiffness, wiring paths, and manufacturability in mind.

Controls and Validation

Test until the system explains itself

I tune controllers from logged behavior and iterate from failure modes instead of assuming the first architecture is the right one.

Machine Shop

Prints, setup, and recuts

I work in manual machining environments where bad fixturing, loose tolerances, and rushed setup decisions show up immediately in the part.

Projects

These are the projects that best show how I work: build the system, test it, and fix what breaks.

MAE 148 is first because it is the clearest example of the robotics work I want to do: sensing, control, packaging, and hardware integration under real constraints.

  • 1 flagship mobile robotics system
  • 3 core engineering case studies
  • 2 projects with real hardware validation
Flagship Project • UCSD MAE 148 • Winter 2026

MAE 148 - Autonomous Trash Collection Robot

Built an autonomous robot that could detect trash, drive to it, stop at the right distance, and pick it up on real hardware

I built and integrated an autonomous trash collection robot that used OAK-D Lite vision, YOLO-based detection, LD19 LiDAR stop logic, ROS2 /cmd_vel control, and a custom-mounted SO101 robotic arm. I worked on the LiDAR stop logic, arm integration, and CAD for the system, and then tested the package on the real vehicle until the sensing and control behavior lined up.

ROS2 • OAK-D Lite • YOLO • LD19 LiDAR • SolidWorks • Raspberry Pi 5 • /cmd_vel control
System
Vision-guided mobile manipulation
Stop Logic
Forward-cone clustering with ~0.16 m standoff
Primary Ownership
LiDAR processing, arm integration, CAD, and packaging
Arm package and mounting geometry for the MAE 148 autonomous trash collection robot.
OAK-D Lite detection output used to generate image-centroid steering commands.
Forward-cone LD19 clustering reduced to the nearest centroid distance for stop control.
Overview

I built an autonomous robot that could detect trash, drive into a manipulation-ready pose, stop at the right distance, and hand off to an onboard arm for pickup and deposit. The stack used a YOLO-derived detector on an OAK-D Lite, ROS2 velocity control on the mobile base, and LD19 LiDAR processing for final standoff control.

The hard part was not getting one subsystem to work by itself. I had to get the sensing geometry, packaging, update rates, and mechanical layout to support the same pickup sequence on the real vehicle.

My Contributions

I owned the LiDAR data processing and stop logic, the robotic arm integration, and the CAD and hardware packaging for the manipulator and sensor stack.

That meant writing and tuning the forward-cone clustering logic, choosing where sensors sat relative to the chassis and arm, and making packaging decisions that improved both reach and controllability.

  • Filtered LD19 scans to a forward cone and converted clustered returns into a centroid-based stop measurement.
  • Integrated the SO101 arm and mounting hardware into the vehicle package.
  • Worked the sensor-placement problem so visual steering and LiDAR stopping referenced usable geometry.
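The forward-cone clustering step described above can be sketched in plain Python. This is a minimal illustration, not the project's actual node: the cone half-width, cluster gap threshold, and valid-range limits are hypothetical; only the ~0.16 m standoff comes from the writeup.

```python
import math

STANDOFF_M = 0.16                     # stop distance from the writeup
CONE_HALF_ANGLE = math.radians(15.0)  # hypothetical forward-cone half-width
GAP_M = 0.10                          # hypothetical range gap that splits clusters

def stop_distance(angles, ranges):
    """Reduce one scan to the nearest forward-cone cluster centroid distance.

    angles/ranges are parallel lists (radians, meters); angle 0 is straight
    ahead. Returns None when no valid return falls inside the cone.
    """
    # keep only plausible returns inside the forward cone
    pts = sorted((a, r) for a, r in zip(angles, ranges)
                 if abs(a) <= CONE_HALF_ANGLE and 0.05 < r < 3.0)
    if not pts:
        return None
    # sweep in angle order, grouping returns whose ranges stay close
    clusters, current = [], [pts[0]]
    for a, r in pts[1:]:
        if abs(r - current[-1][1]) < GAP_M:
            current.append((a, r))
        else:
            clusters.append(current)
            current = [(a, r)]
    clusters.append(current)
    # centroid range of each cluster; the stop logic reacts to the nearest
    return min(sum(r for _, r in c) / len(c) for c in clusters)

def should_stop(angles, ranges):
    d = stop_distance(angles, ranges)
    return d is not None and d <= STANDOFF_M
```

In the real stack this logic would consume LiDAR scan data inside a ROS2 node; keeping it dependency-free here makes the clustering step easy to exercise on canned scans.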
System Architecture

The robot followed a staged reactive pipeline: OAK-D Lite detection picked the target, image-centroid error generated steering corrections, LD19 LiDAR estimated forward standoff distance, and a ROS2 node published /cmd_vel until the robot reached a manipulation-ready position.

We kept the architecture light enough for the platform. Inference ran on the OAK-D Myriad X accelerator to reduce USB load, queue depth stayed low to keep the data fresh, and the controller stayed simple enough that we could tune it on hardware.
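One tick of that staged pipeline can be illustrated as a single function: image-centroid error drives steering, and the LiDAR standoff distance gates forward speed. The frame width, gains, and speed cap below are assumptions for the sketch, not values from the project; only the ~0.16 m standoff is from the writeup.

```python
IMG_WIDTH = 640     # hypothetical camera frame width in pixels
STANDOFF_M = 0.16   # manipulation-ready stop distance from the writeup
K_STEER = 1.2       # hypothetical proportional steering gain
CRUISE_MPS = 0.4    # hypothetical forward speed cap

def step(bbox_center_x, lidar_dist):
    """One control tick: image-centroid steering gated by LiDAR standoff.

    bbox_center_x: horizontal pixel of the detected target's centroid.
    lidar_dist: nearest forward-cone distance in meters (None = no return).
    Returns (linear_x, angular_z) as would be published on /cmd_vel.
    """
    # normalized centroid error in [-1, 1]; zero means the target is centered
    err = (bbox_center_x - IMG_WIDTH / 2) / (IMG_WIDTH / 2)
    angular = -K_STEER * err
    # the LiDAR standoff gates forward motion: hold the manipulation pose
    if lidar_dist is not None and lidar_dist <= STANDOFF_M:
        return 0.0, 0.0
    if lidar_dist is None:
        return CRUISE_MPS, angular
    # taper speed as the standoff distance approaches
    linear = max(0.0, min(CRUISE_MPS, 0.5 * (lidar_dist - STANDOFF_M)))
    return linear, angular
```

The structure mirrors the tradeoff described above: a controller simple enough to tune on hardware, with the stop condition reduced to one scalar distance.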

Engineering Tradeoffs

Sensor placement changed control performance directly. If the camera and LiDAR axes drifted apart mechanically, the robot could center a target visually while stopping relative to the wrong physical line. That made sensor packaging part of the controls problem, not just a mounting task.

The other major constraints were USB bandwidth, compute, and power stability. We reduced camera and LiDAR rates to about 10 Hz, ran inference on-device, and kept the controller simple because the system needed to run reliably on the hardware we had.

Outcome

The finished system could detect a target, follow it, stop before contact, and support arm-based collection from a coherent vehicle package.

It also gave me direct experience with the kind of robotics work I want to keep doing: CAD, sensing, wiring, ROS2 nodes, and controller tuning all coming together on a robot that actually runs.

UCSD Multi-Agent Robotics Lab

Multi-Agent Robotics Coverage

Distributed coverage and control work in simulation and in Crazyflie hardware experiments

Undergraduate Researcher • 2025 - Present

I work on distributed coverage and control problems in the UCSD Multi-Agent Robotics Lab and test how they hold up in both simulation and hardware. Most of the work is in the loop between ROS feedback, tuning, logging, and seeing what changes once the controller leaves the clean simulation case.

Research Focus
Coverage control and hardware validation
Platform
Crazyflie micro-quadrotors
My Work
ROS feedback, tuning, logging, and experiment execution
Python • ROS • Crazyflie • Centroid control • Voronoi coverage • System identification
Engineering Focus
  • I use ROS and Python tooling to make runs easy to log, compare, and debug rather than treating hardware experiments as one-off demos.
  • The work combines centroid control, coverage logic, and repeatable experiment structure so controller changes can be tied to measured behavior.
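The centroid-control idea behind Voronoi coverage can be shown with a minimal 1-D Lloyd-style iteration: each agent owns the samples nearest to it and steps toward that cell's centroid. This is an illustrative sketch, with made-up gain and sampling, not the lab's actual controller.

```python
def voronoi_centroids_1d(agents, samples):
    """Assign each sample to its nearest agent; return per-cell centroids."""
    bins = [[] for _ in agents]
    for x in samples:
        nearest = min(range(len(agents)), key=lambda k: abs(x - agents[k]))
        bins[nearest].append(x)
    # an empty cell keeps its agent in place
    return [sum(b) / len(b) if b else agents[i] for i, b in enumerate(bins)]

def lloyd_step(agents, samples, gain=0.5):
    """Move every agent a fraction of the way toward its cell centroid."""
    return [a + gain * (c - a)
            for a, c in zip(agents, voronoi_centroids_1d(agents, samples))]
```

Repeated `lloyd_step` calls drive the team toward a centroidal configuration; on hardware, the interesting gap is between this clean update and what logged Crazyflie trajectories actually do.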
Validation and Outcome
  • I compare simulation traces against Crazyflie hardware runs and use those gaps to refine controller structure and gain selection.
  • The work sharpened how I think about disturbance, measurement quality, and controller robustness on small robotic platforms.
UCSD Rocket Propulsion Lab

Rocket Propulsion Lab - Daedalus

Structures work for a student rocket targeting roughly 4,000 ft apogee

Structures Lead • 2024 - 2025

I worked on structures for Daedalus, a student rocket project where stiffness, mass, assembly tolerance, and aerodynamic stability all had to be balanced together. I built the structural CAD, checked tolerances, and used quick FEA passes before fabrication and validation.

Vehicle Goal
~4,000 ft apogee
Primary Scope
Structures and integration
Engineering Lens
Mass, stiffness, and stability tradeoffs
SolidWorks • OpenRocket • Tolerance analysis • Basic FEA checks • Instrumentation
Engineering Focus
  • I used CAD and tolerance analysis early to expose assembly problems before fabrication rather than finding them during integration.
  • Quick FEA passes and OpenRocket trade studies helped narrow concepts before the team committed time and material.
Validation and Outcome
  • I supported propulsion and recovery validation with instrumentation and data collection tied back to design assumptions.
  • The work improved how I make structural decisions in the context of the whole system instead of optimizing one part in isolation.
Experience

My experience is strongest where drawings, hardware setup, and execution quality all matter.

UCSD Jacobs Machine Shop

Tutor / Assistant

2026 - Present

I help students move from rough part concepts to manufacturable setups on manual mills and lathes. A lot of the work is catching bad assumptions before material gets cut: prints that are technically correct but hard to fixture, tolerance callouts that do not match the process, or setups that will drift once the part is clamped. I also recut failed parts and help diagnose what actually went wrong at the machine.

Manual mills and lathes • GD&T interpretation • Fixturing • Tolerance troubleshooting
Joint BioEnergy Institute

Engineering Intern

2022

I ran controlled microbial biofuel experiments, kept the logging disciplined, and processed each run to compare yield and repeatability. That work reinforced habits I still use in engineering projects: isolate variables, record conditions clearly, and make conclusions traceable to the data.

Experiment design • Run logging • Repeatability analysis • Technical reporting
iLAB BioTech Partners

Software Engineering Intern

2022 - 2023

I wrote Python scripts for large bioinformatics datasets and built cleaner analysis pipelines around messy inputs. The useful transfer from that role was reproducibility: clear steps, stable outputs, and enough checking that downstream work was not built on bad assumptions.

Python scripting • Data pipelines • Data cleaning • Reproducible workflows
Contact

If you are hiring for robotics or hardware work, I would like to talk.

Email is the fastest way to reach me for internship opportunities, project discussions, or direct technical conversations.

Portfolio focused on robotics, controls, fabrication, and integrated hardware systems.