Ezri A.

Senior Year:

12/18/24 - I finished writing my paper! Now I am testing it experimentally.

11/11/24 I spent this week analyzing how I can use the basic Taylor series expansion to build a linear approximation method. Although I derived a basic one before, I am ironing out the minor details to make sure I am multiplying by the right constants. Currently, I am considering using the basic definition of the derivative rather than the Taylor series. The goal for next week is to determine which one to use. Another issue I need to deal with is that the calculation of the coefficient of drag changes depending on the frame of reference of the equation itself.
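For reference, a minimal sketch of the idea behind both options, assuming a generic one-dimensional function (the actual flight equations are not shown here): a first-order Taylor linearization f(x) ≈ f(a) + f'(a)(x − a), with the slope estimated from the basic definition of the derivative.

```python
import math

def forward_difference(f, a, h=1e-6):
    """Estimate f'(a) from the basic definition of the derivative."""
    return (f(a + h) - f(a)) / h

def linearize(f, a, h=1e-6):
    """Return the linear approximation L(x) = f(a) + f'(a)(x - a)."""
    slope = forward_difference(f, a, h)
    return lambda x: f(a) + slope * (x - a)

# Example: linearize sin(x) around a = 0; near 0, sin(x) is approximately x.
L = linearize(math.sin, 0.0)
print(L(0.1))  # ~0.1, vs. sin(0.1) ~0.0998
```

The constant in question is the slope f'(a); the Taylor route would compute it analytically, while the definition-of-derivative route estimates it numerically as above.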

11/6/24 The last month has been spent writing up the math for mapping the trajectory of the paper airplane. While doing so, I noticed a mistake that I made with the math when solving for the center of mass of the trapezoid. I also consulted with Scott on characterizing the intuitive and physical linear approximation method I derived.
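For reference, the standard centroid height of a trapezoid with parallel sides a (base) and b (top) and height h, measured from the base, is:

```latex
\bar{y} = \frac{h}{3}\,\frac{2b + a}{a + b}
```

Setting b = a recovers the rectangle's h/2 and b = 0 recovers the triangle's h/3, which makes this a quick sanity check on the corrected math.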

9/30/24: This week was spent planning Dark Matter Day, working on the motor, and helping with the wind tunnel. I spent most of my time researching chemistry reactions that would look “fun” for Dark Matter Day. One reaction that Megan pointed out was a clock reaction, so Skylar and I will be doing clock reactions for Dark Matter Day. For the motor, I rewired it again, including the EN+ and EN− ports, but it is still locking up. Finally, I added a second calibration factor for the wind tunnel scale.

9/23/24: This week was spent working on the motor. The motor will lock into position but won’t move. I wonder if not enough power is being drawn from the power source, or if it is a coding or wiring issue. I rewired it two different ways and checked the code; the code should be working. Another possible source of error is that I am using an ESP controller instead of an Arduino, but I strongly doubt that.

9/13/24: This week was mainly spent wiring the motors for the radiotelescope. I set up the drivers for 200 microsteps and 1.5 amps. I used simple PWM commands for the code. When testing the code, I had connectivity issues with the Arduino Uno board, so I switched to a generic ESP8266 microcontroller. The code compiles and runs without error on the ESP, but the motor is not turning. I suspect a faulty motor. Next week I will test the code with a different motor to see if that rectifies the issue. If that doesn’t work, I will swap the power supply, and failing that, I will test whether different code fixes the issue.
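The driver settings above imply some simple conversion factors worth keeping on hand while debugging. A sketch, assuming a common 200 full-step/rev (1.8°/step) motor and treating the microstep factor as a parameter; the exact driver configuration is an assumption, not confirmed here:

```python
def pulses_per_degree(full_steps_per_rev=200, microstep=16):
    """Driver pulses needed to rotate the shaft one degree."""
    return full_steps_per_rev * microstep / 360.0

def pulse_rate_hz(rpm, full_steps_per_rev=200, microstep=16):
    """Step-pulse frequency needed for a target shaft speed."""
    return rpm / 60.0 * full_steps_per_rev * microstep

print(pulses_per_degree())    # ~8.89 pulses/degree at 1/16 microstepping
print(pulse_rate_hz(rpm=30))  # pulse frequency for 30 RPM
```

If the commanded pulse rate is far above what the driver or motor can follow from a standstill, the motor will hold position but not turn, which matches the locking-up symptom.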

9/8/24 This week was spent planning the right ascension mounting system and how it will pivot. I settled on an inset motor mounted underneath the original structure. Additionally, I spent time analyzing data from simulations lab. The error for the z axis is 100x larger than for the theta and x axes; I suspect a misalignment of the numbers. I also don’t think the velocities are being fed into the RNN; adding them may rectify the issue.

9/1/24: This week has mainly been spent on the construction of the radio telescope. I created the mount for declination but have not created the mount for right ascension.

Junior Year:

4/25/24:

The poster session went well. I got no feedback, but I am excited to keep doing my work.

4/16/24:

For the articulating radiotelescope, I cut several of the acrylic pieces needed to put it together. I also ran 100 simulations for the RNN, which gave me about 100 million datapoints. I trained the RNN, and a quick eyeball test showed less than 0.3% error. I need to run further tests before drawing conclusions. I also started looking through eclipse data from the radiotelescope that I brought to the deck.

3/18/24

For simulations, I set up a larger scale test for the RNN to approximate. This time, the model was given 10 simulations, with starting x and z velocities ranging from 2 to 11 meters per second. Each input now has four features: initial x velocity, initial z velocity, initial theta value, and time value. The corresponding x, z, and theta values for that time and those initial velocities are the output. For a quick test, we used the LSTM RNN model to predict values for [time = 0.2, initial x velocity = 2, initial z velocity = 2, initial theta = 45]: x=0.34424400329589844, z=1.1936168670654297, theta=44.90970230102539. These numbers are fairly close to the actual values: t = 0.20000100, x = 0.34881893, z = 1.21259506, theta = 44.88457848. This run used a batch size of 32, a learning rate of 0.001, 20 epochs, and 64 RNN layers.
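The four-feature input layout described above can be sketched as a data-preparation step. The per-simulation field names and structure here are assumptions for illustration, not the actual training code:

```python
import numpy as np

def build_dataset(sims):
    """Build (inputs, targets) arrays from a list of simulations.

    Each simulation is a dict with scalar initial conditions 'vx0', 'vz0',
    'theta0' and equal-length time series 't', 'x', 'z', 'theta'.
    Inputs are [t, vx0, vz0, theta0]; targets are [x, z, theta].
    """
    X, y = [], []
    for s in sims:
        for i in range(len(s["t"])):
            X.append([s["t"][i], s["vx0"], s["vz0"], s["theta0"]])
            y.append([s["x"][i], s["z"][i], s["theta"][i]])
    return np.array(X), np.array(y)

# Tiny fabricated example: one simulation with two timesteps.
sim = {"vx0": 2.0, "vz0": 2.0, "theta0": 45.0,
       "t": [0.0, 0.2], "x": [0.0, 0.349], "z": [0.0, 1.213],
       "theta": [45.0, 44.885]}
X, y = build_dataset([sim])
print(X.shape, y.shape)  # (2, 4) (2, 3)
```

Flattening every timestep of every simulation this way is what turns 10 simulations into the large row count used for training.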

3/11/24:

For simulations, I set up a new simulation with ten different velocities; this data will then be used as training data for the LSTM.

3/7/24:

For Simulations: I replaced the RNN layers in Keras with LSTM layers. This improved the prediction for t=0.2 to: x=1.2400949001312256, z=1.5510003566741943, theta=39.3616943359375.

For Radiotelescope: We tested the horn and Yagi antennas. The Teapot was too low in the sky.

3/4/24:

We got the radiotelescope remote desktop to work! We were able to read the signal from the Milky Way. One issue that still needs to be rectified is that the remote desktop only works when a monitor is already plugged in. I hope to eventually get the radiotelescope to work headlessly.

For simulations lab, I fitted the RNN model with one simulation run, using a batch size of 16, a learning rate of 0.001, 20 epochs, 64 RNN layers, three output nodes (z, x, theta), and one input node (t). The final prediction for t=0.2 was: x=1.204282283782959, z=1.5045793056488037, theta=39.42704391479492, while the actual data was: t = 0.20000100, x = 1.24138119, z = 1.55755632, theta = 39.40853114.
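As a quick check on how close these predictions are, the relative errors can be computed directly from the numbers quoted above (arithmetic only, no model code):

```python
# Predicted vs. actual values copied from the journal entry above.
pred   = {"x": 1.204282283782959, "z": 1.5045793056488037, "theta": 39.42704391479492}
actual = {"x": 1.24138119,        "z": 1.55755632,         "theta": 39.40853114}

for k in pred:
    rel = abs(pred[k] - actual[k]) / abs(actual[k]) * 100
    print(f"{k}: {rel:.2f}% error")
```

The position errors land at a few percent while theta is well under a tenth of a percent, which gives a more concrete baseline than eyeballing the raw numbers.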

2/26/24:

In simulations, we have done research on RNNs, specifically LSTMs. For the radiotelescope, we realized that a monitor needs to be connected to the Raspberry Pi on boot in order for the remote desktop to work. For pivoting the other radiotelescope, we have cut the acrylic.

12/14/23:

A lot has been done in simulations! I have derived a set of three coupled second-order differential equations which are analytically unsolvable, so I am using a modified Euler’s method to solve them numerically. I also configured another Raspberry Pi to work with GNU Radio for the radiotelescope.
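A sketch of the modified Euler (Heun) step, assuming the second-order equations are first rewritten as a system of first-order ones (positions and velocities). The harmonic oscillator below is a stand-in test function, not the actual flight model:

```python
import numpy as np

def heun_step(f, t, y, dt):
    """One modified-Euler step: average the slopes at both ends of the interval."""
    k1 = f(t, y)
    k2 = f(t + dt, y + dt * k1)
    return y + dt / 2.0 * (k1 + k2)

def integrate(f, t0, y0, dt, n_steps):
    """Integrate y' = f(t, y) from t0, recording the state at every step."""
    t, y = t0, np.asarray(y0, dtype=float)
    out = [y.copy()]
    for _ in range(n_steps):
        y = heun_step(f, t, y, dt)
        t += dt
        out.append(y.copy())
    return np.array(out)

# Stand-in check: y'' = -y written as two first-order equations,
# y = [position, velocity]; the exact solution of position is cos(t).
f = lambda t, y: np.array([y[1], -y[0]])
traj = integrate(f, 0.0, [1.0, 0.0], dt=0.01, n_steps=100)
print(traj[-1][0])  # close to cos(1.0), about 0.5403
```

The averaging of the two slope evaluations is what distinguishes the modified method from plain Euler and gives it second-order accuracy.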

11/10/23:

Success for the particle counter! I have a mock one working. It is coded to run headless. There is one button and three LED lights (yellow, red, and green) for interacting with the particle counter to gather the information needed. The data is downloaded to a .txt file; I want to change this to a .csv file.
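The planned .txt-to-.csv change could look something like this. The input layout (whitespace-separated fields, one reading per line) and the column names are assumptions, since the counter's actual file format isn't documented here:

```python
import csv
import io

def txt_to_csv(txt_text, header=("timestamp", "count")):
    """Convert whitespace-separated lines of readings to CSV text."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(header)  # placeholder column names
    for line in txt_text.splitlines():
        if line.strip():
            writer.writerow(line.split())
    return out.getvalue()

sample = "0 12\n1 9\n2 15\n"
print(txt_to_csv(sample))
```

Doing the conversion on a computer after download keeps the Arduino-side SD logging code unchanged.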

11/6/23:

I built and coded a particle counter for the new particle counting lab. The code was written for an Arduino, with an SD card extension that saves the data. It has yet to be tested. I also coded a plant growth algorithm for Hydroponics. The code works by splitting the webcam image into sections, with each section being its own plant, and simply measuring the green in it. I did a complete redesign of the hand, using cables to pull on the fingers. I dislike this design, as getting enough torque to hold something is difficult. I hope to redesign it.
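The greenness measurement described above can be sketched as follows, assuming the webcam frame arrives as an RGB array and each plant occupies an equal-width vertical strip; the actual section layout is an assumption:

```python
import numpy as np

def green_per_section(frame, n_sections):
    """Mean green-channel value of each vertical strip of an (H, W, 3) image."""
    strips = np.array_split(frame, n_sections, axis=1)
    return [float(s[:, :, 1].mean()) for s in strips]

# Fabricated frame: left half pure green, right half black.
frame = np.zeros((4, 8, 3), dtype=np.uint8)
frame[:, :4, 1] = 255
print(green_per_section(frame, 2))  # [255.0, 0.0]
```

Averaging the green channel per strip is the simplest version of "measuring the green"; it could later be refined with a mask so background pixels don't dilute the mean.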

9/20/23

Almost a month later and a lot has happened. Firstly, I got my prosthetic hand to work!! It can count and play rock paper scissors. It no longer uses that silly PCA board. I also set up an ESP8266 module to work as a server, so that I can control the robotic hand using a web app. Now I am completely redoing the hand and am currently printing the new design. The new design allows the servos to sit further back on the hand, which lets me use larger, more powerful servos. I also coded up the Myoware sensors, which detect the bioelectricity flowing through your arm and use that signal to control the hand. I have also helped out the hydroponics project, creating a mask so that background noise will not be picked up by the camera when analyzing the plant. I am now using PlantCV to analyze plant health on a Raspberry Pi. I am also researching the anti-corrosive properties of zinc and their applications in alternative, ocean-based sources of power. All my code is on GitHub if it needs to be referenced.

08/31/23

Week one of WISRD! I spent the first two days putting together any past information left behind by the last group that worked on this project and figuring out how I would move forward. I also surveyed for any obvious damage: a couple of parts of the prosthetic’s thumb are broken and need to be replaced. Looking at the electronics, I made a couple of changes. First, I rewired how the PCA 9685 controller connects to its power source. Then I flipped the connections of the attached servos because they were backward. I then realized that the 9-volt battery was above the operating voltages of both the servos themselves (4.8V-6V) and the servo controller (2.3V-5.5V). The microcontroller was also having connectivity issues; I suspect it was receiving voltage over its limit too, as when I replaced the microcontroller and powered it with a lower 5V, the connectivity issues seemed to go away.

Statement of interest, Spring 2023: When applying to Wildwood, one of the first things the admissions officer brought up was the WISRD institute. After looking at its journal, I noticed one thing that separated this program from everything I had seen before: it reinforces the importance of the process. Rather than brushing over what may seem to be the boring and repetitive methods portion of the journal, this institute seems to embrace it, understanding that outcomes are only legitimized by the validity of the process that created them. I hope both to learn the ins and outs of what it means to do research and to deepen my understanding of the STEM field.

Melanoma, along with other skin cancers, is a big issue in America, especially in sunny states like California, with a projected 8,000 deaths in 2023. Unlike many other forms of cancer, melanoma, because it is a skin condition, is visible as a discoloration on one’s skin. Given the time and resources, I would want to train a diagnostic AI that takes a picture from a phone and determines whether or not a mole is cancerous.