[Interactive demo: press play to watch a neural network learn. The accuracy and epoch counters above the canvas update as training progresses.]
Given the set of colored points above, a neural network attempts to learn a model that can predict the color of any new point. The network's predictions are computed on a grid and shown by the background color; darker colors indicate higher confidence.
The neural network consists of an input layer (2 or 4 nodes), two hidden layers of sizes 12 and 6, and an output layer of size 3. Both input types are included to show the importance of feature selection in model learning. While the network should be able to learn the correct decision boundary for the square and circle datasets given just the (x, y) pairs, it sometimes struggles and takes many thousands of epochs. These datasets become relatively simple once the squared input features (x², y²) are added.
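As a rough illustration (not the demo's actual code), a forward pass of this network might look like the Python/NumPy sketch below. The layer sizes and the optional squared features come from the description above; the tanh activation, softmax output, and weight initialization are assumptions.

    import numpy as np

    # Sketch of the architecture described above:
    # input (2 or 4 features) -> hidden 12 -> hidden 6 -> softmax over 3 colors.

    def make_features(x, y, use_squared=True):
        """Return the input vector: (x, y) or (x, y, x^2, y^2)."""
        return np.array([x, y, x * x, y * y]) if use_squared else np.array([x, y])

    def init_layer(n_in, n_out, rng):
        """Random weights and zero biases for one fully connected layer."""
        return rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_in, n_out)), np.zeros(n_out)

    def forward(params, features):
        """Forward pass; returns class probabilities for the 3 colors."""
        (W1, b1), (W2, b2), (W3, b3) = params
        h1 = np.tanh(features @ W1 + b1)      # hidden layer, size 12
        h2 = np.tanh(h1 @ W2 + b2)            # hidden layer, size 6
        logits = h2 @ W3 + b3                 # output layer, size 3
        exp = np.exp(logits - logits.max())
        return exp / exp.sum()                # softmax probabilities

    rng = np.random.default_rng(0)
    n_inputs = 4  # 2 when only the raw (x, y) pair is used
    params = [init_layer(n_inputs, 12, rng), init_layer(12, 6, rng), init_layer(6, 3, rng)]
    probabilities = forward(params, make_features(0.3, -0.7))
    # The predicted color is argmax(probabilities); its probability sets the background darkness.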

The neural network is not optimized in any way; it is about as simple as possible while still being able to train. The goal was not to make the fastest or most accurate classifier, but to make a simple visualization showing a neural network's progress as it learns a decision boundary. Sometimes, with just the (x, y) input, it is interesting to watch the network slowly find the correct model.

Training stops after 10,000 epochs or once the network reaches 95 percent accuracy on the training set. After every 10 epochs, the network pauses for 0.1 seconds to give the decision boundary time to render. Training also stops if the canvas changes size or a different configuration is chosen.
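The control flow is roughly the sketch below; the callables are placeholders for the demo's own training, accuracy, rendering, and cancellation checks, and only the limits and the pause come from the description above.

    import time

    MAX_EPOCHS = 10_000
    TARGET_ACCURACY = 0.95

    def run_training(train_one_epoch, accuracy, render, stop_requested):
        """Train until the epoch limit, the accuracy target, or an external stop."""
        for epoch in range(1, MAX_EPOCHS + 1):
            train_one_epoch()
            if accuracy() >= TARGET_ACCURACY:
                break                  # reached 95 percent accuracy on the training set
            if stop_requested():
                break                  # canvas resized or a new configuration chosen
            if epoch % 10 == 0:
                render()               # redraw the decision boundary...
                time.sleep(0.1)        # ...and pause 0.1 s so it has time to appear
        return epoch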

About Me


Roger

Computer Science Ph.D. student at the University of Wisconsin-Madison

I'm a first-year Ph.D. student working under the supervision of Prof. Theodoros (Theo) Rekatsinas. My current research focuses on understanding the role of overparameterization in neural networks and on techniques for training smaller networks without sacrificing accuracy. I also work on machine learning algorithms for long, complex input sequences. This style of data is hard to model because the input may consist of multiple types (e.g., text with images) and because hidden dependencies within and across these inputs are difficult to extract (e.g., text-text and text-image dependencies).

More broadly, I'm also interested in computer vision, especially as it relates to self-driving, and in big data systems. Beyond CS, I have a background in physics and engineering, and I am excited about current developments in fields such as renewable energy, battery technology, electric motors, plasma science, and fusion energy.

Phone
(608) 228 6510
Address
1210 W Dayton St, Madison, WI 53706

Education


Sep. 2015 – May 2019
B.S. – Applied Mathematics, Engineering, and Physics (AMEP)
University of Wisconsin-Madison
Overall GPA: 4.00/4.00

Sep. 2015 – May 2019
B.S. – Computer Science (second major)
University of Wisconsin-Madison
Overall GPA: 4.00/4.00

Sep. 2019 – Present
Enrolled in Computer Science Ph.D. program
University of Wisconsin-Madison
Expected graduation: May 2023

Experience


Jun. 2019 – Present
Department of Neurology @ UW-Madison
Honorary Associate
Work with Dr. Aaron Struck and Dr. Elizabeth Felton's epilepsy research lab to continue development of automated seizure detection from electroencephalogram (EEG) recordings using machine learning models. Extends work (below) that began through undergraduate research with a graduate student in Prof. Suman Banerjee's WiNGS Lab.

Jun. 2018 – May 2019
WI Wireless and NetworkinG Systems Lab @ UW-Madison
Undergraduate Researcher
Worked with Prof. Suman Banerjee and his graduate student. Implemented a long short-term memory (LSTM) network for time series prediction, improving the group's earlier results for scheduling water softener regeneration by 50 percent. Developed machine learning models for electroencephalogram (EEG) seizure detection using convolutional and recurrent convolutional neural networks (CNN, RCNN).

Jun. 2018 – Aug. 2018
Amazon
Software Development Engineer – Intern
Designed and built a serverless application on Amazon Web Services (AWS) to connect Amazon.com customer accounts to corresponding customer accounts at an Amazon subsidiary, with a strong focus on scalability. Created dashboards, alarms, and documentation for the product. The product continues to operate stably in production (checked as of 02/28/2019). Worked with a team of seven other developers.

Jan. 2016 – May 2018
Wisconsin Plasma Physics Lab @ UW-Madison
Undergraduate Researcher and Engineer
Worked with Prof. Cary Forest and graduate students to design next-generation mirror plasma confinement devices. Extended and developed magnetic and plasma equilibrium software to aid in geometry and stability analysis. Additional work included planning and building equipment for experiments on the Big Red Ball (BRB), including vacuum parts, probes, power supplies, and circuits.

Awards


UW-Madison CS Departmental Research Fellowship
2019-2020
Fellowship awarded to the very top students admitted to the UW-Madison CS graduate program. The award covers salary, tuition remission, and health benefits.
UW-Madison CS NEST 2nd Place
Spring 2019
Pitched a startup proposal for autonomous EEG monitoring and seizure detection. Judged against ~20 other teams. Team consisted of myself (presenter, lead), Jason Mohoney, Neil Klingensmith, and Dr. Aaron Struck.
Goldwater Scholarship
2018
National award termed “the most prestigious undergraduate scholarship in STEM”. One of a maximum of four nominations from UW-Madison and one of 211 award winners.
Hilldale Research Fellowship
2017
Provided funding for undergraduate research at UW-Madison on work entitled “Axisymmetric Spherical Mirror for a Fusion Neutron Source”.
Ingersoll Award
Fall 2015
Awarded to the top student in an introductory physics class at UW-Madison. Received for work in Physics 247: A Modern Introduction to Physics (accelerated honors section).
William F. Vilas Scholarship
Fall 2015, 2016, 2017, 2018
Awarded to undergraduate students at UW-Madison for strong academic performance.

Publications


Publications
Peterson, E. E., Endrizzi, D. A., Beidler, M., Bunkers, K. J., Clark, M., Egedal, J., Flanagan, K., McCollam, K. J., Milhone, J., Olson, J., Sovinec, C. R., Waleffe, R., Wallace, J., Forest, C. B. A laboratory model for the Parker spiral and magnetized stellar winds. Nature Physics. 2019.
Media coverage: PBS, Quanta, Wired, Cosmos, Science News, UW-Madison
Brookhart, M. I., Stemo, A., Waleffe, R., Forest, C. B. Driving magnetic turbulence using flux ropes in a moderate guide field linear system. Journal of Plasma Physics. 2017.
Presentations
Waleffe, R., Peterson, E. E., Anderson, J., Clark, M., Wallace, J., Forest, C. B. Investigation of magnetic mirror configurations at the WiPPL facility and their applications. 60th Annual Meeting of the APS Division of Plasma Physics. 2018.
Poster
Presented by Anderson, J.
Waleffe, R., Peterson, E. E., Mirnov, V., Forest, C. B. Stability of an axisymmetric, non-paraxial mirror and its applications for a fusion neutron source. 59th Annual Meeting of the APS Division of Plasma Physics. 2017.
Poster
Waleffe, R., Peterson, E. E., Forest, C. B. Spherical Mirror as a Fusion Neutron Source. Meeting of the Physics Department Board of Visitors. 2017.
Talk
Waleffe, R., Endrizzi, D. A., Peterson, E. E., Forest, C. B. High Current, High Density Arc Plasma Source for WiPAL. 58th Annual Meeting of the APS Division of Plasma Physics. 2016.
Poster

Personal Projects