[Interactive demo: press play to watch a neural network learn. The page displays accuracy and the current epoch as training progresses.]
Given the set of colored points above, a neural network attempts to learn a model that can predict the color of any new point. Network predictions are computed on a grid and shown as the background color; darker colors indicate higher confidence.
The neural network consists of an input layer (2 or 4 nodes), two hidden layers of sizes 12 and 6, and an output layer of size 3. Both input types are included to show the importance of feature selection in model learning. While the network should be able to learn the correct decision boundary for the square and circle datasets given just the (x, y) pairs, it sometimes struggles and takes many thousands of epochs. With the squared feature dimensions added to the input, these datasets become relatively simple.
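The architecture described above can be sketched as follows (a hypothetical NumPy reconstruction, not the demo's actual code; the weight initialization and tanh/softmax activations are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def features(xy, squared=True):
    """Build the inputs: (x, y) alone, or with the squared features appended."""
    return np.hstack([xy, xy ** 2]) if squared else xy

def init_params(n_in):
    """Layer sizes matching the description: input (2 or 4) -> 12 -> 6 -> 3."""
    sizes = [n_in, 12, 6, 3]
    return [(rng.normal(0.0, 0.5, (a, b)), np.zeros(b))
            for a, b in zip(sizes, sizes[1:])]

def forward(params, X):
    """Tanh hidden layers; softmax over the 3 color classes."""
    h = X
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)
    W, b = params[-1]
    z = h @ W + b
    e = np.exp(z - z.max(axis=1, keepdims=True))  # stable softmax
    return e / e.sum(axis=1, keepdims=True)
```

Evaluating `forward` on a grid of points and coloring each cell by the argmax class (with opacity set by the winning probability) reproduces the background rendering described above.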

The neural network is not optimized in any way. It is about as simple as possible while still being able to provide training functionality. The goal was not to make the fastest, most accurate classifier, but rather to make a simple visualization showing the progression of a neural network trying to learn a decision boundary. Sometimes, for the simple (x, y) input, it's interesting to see the network slowly find the correct model.

Training stops after 10,000 epochs or after the network reaches 95 percent accuracy on the training set. After every 10 epochs, the network waits 0.1 seconds before continuing to give time for the decision boundary to render. Training also stops if the canvas changes size or if a different configuration is chosen.
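The training schedule can be sketched as a loop with the two stopping rules above (a minimal stand-in: full-batch softmax regression on a toy 3-class dataset replaces the demo's network and data, but the 10,000-epoch cap and 95 percent early stop mirror the description):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 3-class dataset standing in for the demo's colored points.
X = rng.normal(0, 1, (300, 2)) + np.repeat(
    np.array([[0.0, 4.0], [4.0, 0.0], [-4.0, -4.0]]), 100, axis=0)
y = np.repeat(np.arange(3), 100)

W = np.zeros((2, 3))
b = np.zeros(3)

def accuracy():
    return np.mean((X @ W + b).argmax(axis=1) == y)

MAX_EPOCHS, TARGET_ACC, LR = 10_000, 0.95, 0.1
for epoch in range(1, MAX_EPOCHS + 1):
    # One full-batch cross-entropy gradient step per "epoch".
    z = X @ W + b
    p = np.exp(z - z.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(len(y)), y] -= 1          # dL/dz for softmax cross-entropy
    W -= LR * X.T @ p / len(y)
    b -= LR * p.mean(axis=0)
    if accuracy() >= TARGET_ACC:          # early stop at 95% training accuracy
        break
    if epoch % 10 == 0:
        pass  # the demo pauses 0.1 s here so the decision boundary can render
```

In the demo the pause every 10 epochs yields control back to the browser so the boundary can redraw; a script has no such constraint, so the hook is a no-op here.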

Computer Science Ph.D. student at the University of Wisconsin-Madison

I'm a third-year Ph.D. student working under the supervision of Prof. Theodoros (Theo) Rekatsinas. My current research focuses on large-scale training of Graph Neural Networks and Graph Embeddings. Together with Jason Mohoney, I am developing the Marius system for deep learning over billion-scale graphs.

Beyond CS, I have a background in physics and engineering. I am excited about current developments in fields such as renewable energy, battery technology, electric motors, plasma science, and fusion energy.

## Education

Sep. 2015 – May 2019
B.S. – Applied Mathematics, Engineering, and Physics (AMEP)
Overall GPA: 4.00/4.00

Sep. 2015 – May 2019
B.S. – Computer Science (second major)
Overall GPA: 4.00/4.00

Sep. 2019 – Present
Computer Science Ph.D. program
Masters expected May 2022

## Selected Experience

Research Assistant
Aug. 2019 – Present
Graduate student in computer science at the University of Wisconsin-Madison. Worked on neural network compression during training and on systems for deep learning over large-scale graphs. Current work focuses on scaling the training of graph ML models on a single machine using the entire memory hierarchy, including disk.

Amazon
Software Development Engineer – Intern
Jun. 2018 – Aug. 2018
Designed and built a serverless application on Amazon Web Services (AWS) to connect Amazon.com customer accounts to corresponding accounts at an Amazon subsidiary, with particular attention to scalability. Created dashboards, alarms, and documentation for the product, which continues to operate stably in production (checked as of 02/28/2019). Worked with a team of seven other developers.

Wisconsin Plasma Physics Lab @ UW-Madison
Jan. 2016 – May 2018
Worked with Prof. Cary Forest and graduate students to design next-generation mirror plasma confinement devices. Extended and developed magnetic and plasma equilibrium software to aid geometry and stability analysis. Additional work included planning and building equipment for experiments on the Big Red Ball (BRB), including vacuum parts, probes, power supplies, and circuits.

## Awards

2019-2020
Fellowship awarded to the very top students admitted to the UW-Madison CS graduate program. The award covers salary, tuition remission, and health benefits.

Spring 2019
Pitched a startup proposal for autonomous EEG monitoring and seizure detection, judged against ~20 other teams. Team consisted of myself (presenter, lead), Jason Mohoney, Neil Klingensmith, and Dr. Aaron Struck.

Goldwater Scholarship, 2018
National award termed “the most prestigious undergraduate scholarship in STEM”. One of a maximum of four nominations from UW-Madison and one of 211 award winners.

Hilldale Research Fellowship, 2017
Provided funding for undergraduate research at UW-Madison on work entitled “Axisymmetric Spherical Mirror for a Fusion Neutron Source”.

Ingersoll Award, Fall 2015
Awarded to the top student in an introductory physics class at UW-Madison; recipient for work in Physics 247: A Modern Introduction to Physics, accelerated honors section.

William F. Vilas Scholarship, Fall 2015, 2016, 2017, 2018

## Publications

### Computer Science
Waleffe, R., Mohoney, J., Rekatsinas, T., Venkataraman, S. Marius++: Large-Scale Training of Graph Neural Networks on a Single Machine. In Submission. 2022.
Xie, A., Carlsson, A., Mohoney, J., Waleffe, R., Peters, S., Rekatsinas, T., Venkataraman, S. Demo of Marius: A System for Large-scale Graph Embeddings. Proceedings of the VLDB Endowment, 14(12). 2021.
Mohoney, J., Waleffe, R., Xu, Y., Rekatsinas, T., Venkataraman, S. Marius: Learning Massive Graph Embeddings on a Single Machine. 15th Symposium on Operating Systems Design and Implementation. 2021.
Waleffe, R., Rekatsinas, T. Principal Component Networks: Parameter Reduction Early in Training. 2020.
### Plasma Physics
Peterson, E. E., Endrizzi, D. A., Beidler, M., Bunkers, K. J., Clark, M., Egedal, J., Flanagan, K., McCollam, K. J., Milhone, J., Olson, J., Sovinec, C. R., Waleffe, R., Wallace, J., Forest, C. B. A laboratory model for the Parker spiral and magnetized stellar winds. Nature Physics. 2019.
Media coverage: PBS, Quanta, Wired, Cosmos, Science News, UW-Madison
Brookhart, M. I., Stemo, A., Waleffe, R., Forest, C. B. Driving magnetic turbulence using flux ropes in a moderate guide field linear system. Journal of Plasma Physics. 2017.
### Presentations

Waleffe, R., Peterson, E. E., Anderson, J., Clark, M., Wallace, J., Forest, C. B. Investigation of magnetic mirror configurations at the WiPPL facility and their applications. 60th Annual Meeting of the APS Division of Plasma Physics. 2018. (Poster, presented by Anderson, J.)

Waleffe, R., Peterson, E. E., Mirnov, V., Forest, C. B. Stability of an axisymmetric, non-paraxial mirror and its applications for a fusion neutron source. 59th Annual Meeting of the APS Division of Plasma Physics. 2017. (Poster)

Waleffe, R., Peterson, E. E., Forest, C. B. Spherical Mirror as a Fusion Neutron Source. Meeting of the Physics Department Board of Visitors. 2017. (Talk)

Waleffe, R., Endrizzi, D. A., Peterson, E. E., Forest, C. B. High Current, High Density Arc Plasma Source for WiPAL. 58th Annual Meeting of the APS Division of Plasma Physics. 2016. (Poster)