The University of Arizona
Research Data Support Specialist
LinkedIn Profile
Codewars Profile
Plant Anatomy Animation One
Plant Anatomy Animation Two
I'm Travis Simmons, and I am in love with the intersection of plants and computers.
I combine computer vision, machine learning, and biology to develop and implement high-throughput phenotyping pipelines for the world's largest plant phenotyping robot at the University of Arizona Pauli Lab.
Over the course of my summer 2020 internship with the University of Arizona Pauli Lab, I used the georeferenced centroid points created for each plant during the canopy-area detection stage of our RGB pipeline to track individual plants over time. I did this by comparing the distances between centroids from separate days of scans and pairing the closest points.
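The closest-point pairing described above can be sketched as follows. This is a minimal illustration, not the actual pipeline code; the function name, coordinates, and the `max_dist` cutoff are placeholders I chose for the example.

```python
import numpy as np

def pair_plants(day1, day2, max_dist=0.5):
    """Greedily pair each day-1 centroid with its nearest unclaimed day-2 centroid.

    day1, day2: arrays of (x, y) georeferenced centroids; max_dist caps how far
    a plant is allowed to "move" between scans (placeholder value).
    Returns a dict mapping day-1 indices to day-2 indices.
    """
    day1 = np.asarray(day1, dtype=float)
    day2 = np.asarray(day2, dtype=float)
    # pairwise Euclidean distances between the two days' centroids
    dists = np.linalg.norm(day1[:, None, :] - day2[None, :, :], axis=2)
    pairs, claimed = {}, set()
    # visit candidate pairs from closest to farthest, keeping each point once
    for i, j in zip(*np.unravel_index(np.argsort(dists, axis=None), dists.shape)):
        if i in pairs or j in claimed or dists[i, j] > max_dist:
            continue
        pairs[i] = j
        claimed.add(j)
    return pairs
```

Pairing greedily from the smallest distance upward keeps each centroid in at most one match, so a plant detected on both days is linked to a single track.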
After I became more comfortable with machine learning, this distance-pairing approach was replaced with agglomerative clustering algorithms from the scikit-learn library.
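A minimal sketch of what that replacement might look like with scikit-learn's `AgglomerativeClustering`: centroids from multiple days are pooled, and any points within a distance threshold merge into one cluster, i.e. one plant. The sample coordinates, linkage choice, and 0.5 threshold are assumptions for illustration, not values from the actual pipeline.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

# hypothetical centroids pooled from two days of scans (x, y coordinates)
points = np.array([
    [0.00, 0.00], [0.05, 0.02],   # same plant seen on two days
    [2.00, 2.00], [2.04, 1.98],   # a second plant seen on two days
])

# leave the cluster count open and merge anything closer than the threshold
model = AgglomerativeClustering(n_clusters=None, distance_threshold=0.5,
                                linkage="single")
labels = model.fit_predict(points)
# detections sharing a label are treated as the same plant across days
```

Setting `n_clusters=None` with a `distance_threshold` lets the number of plants emerge from the data rather than being fixed in advance.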
During my second internship with the University of Arizona Pauli Lab, I explored different methods of plant identification. To begin my journey into machine learning, I developed Haar cascades, along with training data sets, for drone imagery of the lettuce fields at the Maricopa Agricultural Center. All cascades were trained with the open-source program Cascade Trainer GUI.
These point clouds were generated using a LemnaTec 3D scanner at our field in Maricopa, Arizona. I then gave each point a volume by programmatically spawning a sphere at each point, and the point clouds were animated using Blender. This was part of an effort that was featured in The Wall Street Journal.
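Spawning a sphere per point can be sketched with Blender's Python API. This is an illustrative outline, not the script used for the animations: the file name, text-export format, and sphere radius are assumptions, and the `bpy` call only works when run inside Blender.

```python
def load_points(path):
    """Parse a plain-text x y z export of a point cloud, one point per line."""
    points = []
    with open(path) as fh:
        for line in fh:
            x, y, z = map(float, line.split()[:3])
            points.append((x, y, z))
    return points

def spawn_spheres(points, radius=0.01):
    """Add one UV sphere per point so each point has a visible volume."""
    import bpy  # only available inside Blender's bundled Python
    for loc in points:
        bpy.ops.mesh.primitive_uv_sphere_add(radius=radius, location=loc)

# Inside Blender's scripting tab (hypothetical file name):
# spawn_spheres(load_points("scan_points.xyz"))
```

With the spheres in the scene, Blender's usual keyframing and camera tools handle the animation itself.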