Chase Murray

Assistant Professor
Industrial & Systems Engineering
University at Buffalo

Phone:  (716) 645-4716
Office: 309 Bell Hall

My research interests involve the application of operations research (OR) techniques to solve problems encountered by industry and the military.  In particular, I’m interested in leveraging the capabilities of (semi-) autonomous vehicles for logistics and surveillance.  This includes routing and scheduling of unmanned aerial vehicles (UAVs, also known as drones).  I am a Federal Aviation Administration (FAA) certified drone pilot.

Featured Research

Drone Flight Training and Simulation System – This application helps pilots, both new and experienced, improve their drone-flying skills.  It was originally developed to ensure that students in the Optimator Lab at the University at Buffalo were proficient at operating a drone before flying our hardware outside.  However, we have added numerous features that make the system …read more…


Multi-user Multi-vehicle Mission Control (M3C) – The M3C package provides mission planning and execution capabilities for teams of UAVs (drones) and UGVs (rovers). We developed this system to enable command and control of autonomous vehicles. Key features of the system include …read more…


UAVs in Logistics – Motivated by Amazon’s “Prime Air” concept for small-parcel delivery by UAV, we have developed algorithms that coordinate traditional delivery trucks with quadcopters.  These algorithms minimize the total time required for deliveries, helping customers receive packages faster and improving the overall effectiveness of the delivery process.  Read more…
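To give a flavor of the coordination problem, here is a toy Python sketch (not our published algorithms): each customer is served either by the truck or by the drone, some customers are drone-ineligible, and the mission ends only when both vehicles finish, so a greedy rule tries to balance the two workloads.  The customer names, travel times, and eligibility flags below are invented for illustration.

```python
def assign_deliveries(customers):
    """customers: list of (name, truck_minutes, drone_minutes, drone_ok).
    Returns (truck_route, drone_route, estimated_completion_time)."""
    truck, drone = [], []
    truck_time = drone_time = 0.0
    for name, t_truck, t_drone, drone_ok in customers:
        # Send the drone only when the customer is drone-eligible AND
        # doing so keeps the drone's workload below the truck's alternative.
        if drone_ok and drone_time + t_drone <= truck_time + t_truck:
            drone.append(name)
            drone_time += t_drone
        else:
            truck.append(name)
            truck_time += t_truck
    # Both vehicles must finish before the mission is complete,
    # so the objective is the maximum (makespan), not the sum.
    return truck, drone, max(truck_time, drone_time)
```

The key modeling point this sketch captures is the makespan objective: speeding up one vehicle helps only until the other vehicle becomes the bottleneck, which is what makes truck–drone coordination harder than routing either vehicle alone.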


Autonomous Cars – I couldn’t help but include this little video clip. This was the very first run of our autonomous “Buffalo Car” (the University at Buffalo’s version of the Donkey Car). While the Donkey Car uses machine learning (ML) algorithms to navigate, our car simply uses the on-board camera. Although the ML algorithms appear to work great (there’s a large, growing, and talented group of people working on the Donkey Car…check YouTube for some great demonstrations), they require training via a human driver. Our goal was to create a system that would either automate the training process or demonstrate driving ability on par with the ML version. The maiden voyage of our car wasn’t spectacular, but it followed the white line (which turned out to be a surprising challenge, since ceiling lights reflecting off the shiny black tiles look like large white lines to our car’s camera) and it didn’t crash. We’ll be working on the vision system over the next few months…check back soon for more details and an updated video.
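For the curious, the basic line-following idea can be sketched in a few lines of Python (a simplified illustration, not our actual vision code): threshold one row of the camera image for bright pixels, compute their centroid, and steer toward it.  The threshold and pixel values here are invented.

```python
def steering_offset(row, threshold=200):
    """row: grayscale intensities (0-255) across one camera image row.
    Returns the bright-pixel centroid's offset from image center,
    scaled to [-1, 1] (negative = steer left), or None if no bright
    pixels are found (line lost)."""
    bright = [i for i, v in enumerate(row) if v >= threshold]
    if not bright:
        return None
    centroid = sum(bright) / len(bright)
    center = (len(row) - 1) / 2
    return (centroid - center) / center
```

This sketch also shows why the shiny floor tiles caused trouble: a specular reflection from the ceiling lights adds bright pixels far from the actual line, which drags the centroid (and therefore the steering command) away from where the line really is.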