Ezri A.

3/25/24:

Poster session went well. I got no feedback, but I'm excited to keep doing my work.

4/16/24:

For the articulating radiotelescope, I cut several of the acrylic pieces needed to put it together. I also ran 100 simulations for the RNN, which gave me about 100 million datapoints. I trained the RNN, and a quick eyeball test showed less than 0.3% error. I need to run further tests before drawing conclusions. I also started looking through eclipse data from the radiotelescope that I brought to the deck.
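
Since that <0.3% figure came from eyeballing, a more defensible check would be a mean relative error over held-out simulation data. A minimal sketch, assuming the trained Keras model and NumPy test arrays already exist (the names here are placeholders, not the project's actual code):

```python
import numpy as np

def mean_relative_error(model, X_test, y_test):
    """Mean |pred - true| / |true| for each output column (x, z, theta)."""
    y_pred = model.predict(X_test)
    rel = np.abs(y_pred - y_test) / np.maximum(np.abs(y_test), 1e-8)  # avoid /0
    return rel.mean(axis=0)
```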

3/18/24:

For simulations, I set up a larger-scale test for the RNN to approximate. This time, the model was given 10 simulations, with starting x and z velocities ranging from 2 to 11 meters per second. Each input of data now had four features: initial x velocity, initial z velocity, initial theta value, and time value. The corresponding x, z, and theta values for that time and those initial velocities are the output. For a quick test, we used the LSTM RNN model to predict values for [time = 0.2, initial x velocity = 2, initial z velocity = 2, initial theta = 45]: x=0.34424400329589844, z=1.1936168670654297, theta=44.90970230102539. These numbers are fairly close to the actual values: t = 0.20000100, x = 0.34881893, z = 1.21259506, theta = 44.88457848. This was trained using a batch size of 32, a learning rate of 0.001, 20 epochs, and 64 RNN layers.
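
For reference, a minimal sketch of a model with those hyperparameters, assuming "64 RNN layers" means one LSTM layer with 64 units and that each sample is a single timestep of the four features [initial x velocity, initial z velocity, initial theta, time]:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1, 4)),  # (timesteps, features)
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(3),      # outputs: x, z, theta
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001), loss="mse")
# model.fit(X_train, y_train, batch_size=32, epochs=20)
```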

3/11/24:

For simulations, I set up a new simulation with ten different initial velocities. This data will then be used as training data for the LSTM.
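
A sketch of how that training set could be assembled, assuming the x and z velocities are stepped together across the ten runs and that simulate() stands in for the actual simulation code, returning t, x, z, theta arrays for one set of initial conditions:

```python
import numpy as np

rows = []
for v0 in np.linspace(2, 11, 10):            # ten initial velocities
    t, x, z, theta = simulate(vx0=v0, vz0=v0, theta0=45.0)  # placeholder
    for i in range(len(t)):
        rows.append([v0, v0, 45.0, t[i], x[i], z[i], theta[i]])

data = np.array(rows)                        # 4 input columns, 3 target columns
np.save("lstm_training_data.npy", data)
```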

3/7/24:

For simulations: I replaced the RNN layers in Keras with LSTM layers. This got the prediction for t=0.2 seconds to: x=1.2400949001312256, z=1.5510003566741943, theta=39.3616943359375.
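
The swap itself is a one-line change in the Keras model definition; something like:

```python
import tensorflow as tf

# before: a plain recurrent layer
# tf.keras.layers.SimpleRNN(64)
# after: a drop-in LSTM layer in the same position
layer = tf.keras.layers.LSTM(64)
```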

For the radiotelescope: we tested the horn and Yagi antennas. The Teapot (toward the galactic center) was too low in the sky.

3/4/24:

We got the radiotelescope remote desktop to work! We were able to read signal from the Milky Way. One issue that still needs to be rectified is that the remote desktop only works when a monitor is already plugged in. I hope to eventually get the radiotelescope to work headlessly.
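
A possible route to that headless setup, assuming the telescope computer is the Raspberry Pi mentioned in later entries and the remote desktop mirrors the physical display: force the HDMI output on in /boot/config.txt so the desktop still renders with no monitor attached.

```
# /boot/config.txt -- pretend a monitor is connected on boot
hdmi_force_hotplug=1
# optionally pin the fake display to a known resolution (DMT mode 82 = 1080p60)
hdmi_group=2
hdmi_mode=82
```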

For simulations lab, I fitted the RNN model with one simulation run, using a batch size of 16, a learning rate of 0.001, 20 epochs, 64 RNN layers, three output nodes (z, x, theta), and one input node (t). The final prediction for 0.2 seconds was: t=0.2: x=1.204282283782959, z=1.5045793056488037, theta=39.42704391479492, while the actual data was: t = 0.20000100, x = 1.24138119, z = 1.55755632, theta = 39.40853114.
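
For the record, the gap between those predicted and actual values can be quantified directly from the numbers above:

```python
import numpy as np

pred   = np.array([1.204282283782959, 1.5045793056488037, 39.42704391479492])
actual = np.array([1.24138119, 1.55755632, 39.40853114])
print(np.abs(pred - actual) / np.abs(actual) * 100)
# -> roughly [3.0, 3.4, 0.05] percent error for x, z, theta
```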

2/26/24:

In simulations: we have done research on RNNs, specifically LSTMs. For the radiotelescope, we realized that a monitor needs to be connected to the Raspberry Pi on boot in order for the remote desktop to work. For pivoting the other radiotelescope, we have cut the acrylic.

12/14/23:

A lot has been done in simulations! I have created a set of three coupled second-order differential equations which have no analytical solution. I am using a modified Euler's method to solve them numerically (see the sketch below). I also configured another Raspberry Pi to work with GNU Radio for the radiotelescope.
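
The journal doesn't spell out the equations, so as a sketch of the integrator only: one modified-Euler step (the Heun predictor-corrector variant; the entry doesn't say which modification) for the system rewritten as first-order equations, with f() as a placeholder for the actual right-hand side:

```python
import numpy as np

def heun_step(f, t, state, dt):
    """One modified-Euler step: Euler predictor, then average the two slopes.

    state is e.g. [x, z, theta, vx, vz, vtheta]; f(t, state) returns d(state)/dt.
    """
    k1 = f(t, state)
    k2 = f(t + dt, state + dt * k1)  # slope at the Euler-predicted point
    return state + 0.5 * dt * (k1 + k2)
```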

11/10/23:

Success for the particle counter! I have a mock-up working. It is coded to work headless. There is one button and three LED lights (yellow, red, and green) for interacting with the particle counter to gather the information needed. The data is downloaded to a .txt file; I want to change this to a .csv file.
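
The .txt-to-.csv change could be as small as a conversion script, assuming each line of the downloaded log holds whitespace-separated fields (the real log format may differ):

```python
import csv

with open("particle_log.txt") as src, open("particle_log.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for line in src:
        fields = line.split()
        if fields:                 # skip blank lines
            writer.writerow(fields)
```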

11/6/23:

I built and coded a particle counter for the new particle-counting lab. The code was written on an Arduino, with an SD card extension that saves the data. It has yet to be tested. I also coded a plant growth algorithm for hydroponics. This code works by splitting the webcam frame into sections, with each section being its own plant, and simply measuring the green in it (sketched below). I did a complete redesign of the hand, using cables to pull on the fingers. I dislike this design, as getting enough torque to hold something is difficult. I hope to redesign it.
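
A sketch of that green-measuring idea, with an assumed grid size and an illustrative HSV threshold for "green" (not the project's actual values):

```python
import cv2
import numpy as np

def green_scores(frame, rows=2, cols=3):
    """Split a BGR webcam frame into a rows x cols grid (one cell per plant)
    and return the fraction of green pixels in each cell."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([35, 40, 40]), np.array([85, 255, 255]))
    h, w = mask.shape
    scores = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = mask[r*h//rows:(r+1)*h//rows, c*w//cols:(c+1)*w//cols]
            scores[r, c] = cell.mean() / 255.0
    return scores
```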

9/20/23:

Almost a month later and a lot has happened. Firstly, on my prosthetic hand: I got it to work!! It could count and play rock-paper-scissors. It no longer uses that silly PCA board. I also set up an ESP8266 module to work as a server, so that I can control the robotic hand using a web app. Now I am completely redoing the hand, and am currently printing the new design. The new design lets the servos sit further back on the hand, which allows me to use larger, more powerful servos. I also coded up the MyoWare sensors, which detect the bioelectricity flowing through your arm, and use that signal to control the hand. I have also helped out the hydroponics project, creating a mask so that background noise will not be picked up by the camera when analyzing the plant (a sketch of this follows the entry). I am now using PlantCV to analyze plant health on a Raspberry Pi. I am also doing research on the anti-corrosive properties of zinc and its applications in alternative, ocean-based sources of power. All my code is on GitHub if it needs to be referenced.
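
A sketch of that masking step, using a stand-in rectangle for the plant region (the real mask could be hand-drawn or derived from a color threshold):

```python
import cv2
import numpy as np

frame = cv2.imread("plant.jpg")                  # placeholder path
mask = np.zeros(frame.shape[:2], dtype=np.uint8)
cv2.rectangle(mask, (100, 50), (500, 430), 255, thickness=-1)  # assumed plant area
masked = cv2.bitwise_and(frame, frame, mask=mask)  # background zeroed out
cv2.imwrite("plant_masked.jpg", masked)
```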

08/31/23:

Week one of WISRD! I spent the first two days putting together any past information left behind by the last group that worked on this project and figuring out how I would move forward. I also surveyed for any obvious damage: a couple of parts of the prosthetic's thumb are broken and need to be replaced. Looking at the electronics, I made a couple of changes. First, I rewired how the PCA9685 controller connects to its power source. Then I flipped the connections of the attached servos, because they were backward. Then I realized that the 9-volt battery was above the operating voltages of both the servos themselves (4.8V-6V) and the servo controller (2.3V-5.5V). The microcontroller was also having connectivity issues; I suspect it was receiving voltage over its limit too, as the connectivity issues seemed to go away when I replaced the microcontroller and powered it with a lower 5V.

Statement of interest, Spring 2023: When applying to Wildwood, one of the first things the admissions officer brought up was the WISRD institute. After looking at its journal, I noticed one thing that separated this program from everything I had seen before - it reinforces the importance of the process. Rather than brushing over what may seem to be the boring and repetitive methods portion of the journal, this Institute seems to embrace it, understanding that the outcomes are only legitimized by the validity of the process that created them. I hope to both learn the ins and outs of what it means to do research and deepen my understanding of the STEM field.

Melanoma, along with other skin cancers, is a big issue in America, especially in sunny states like California, with a projected 8,000 deaths in 2023. Unlike many other forms of cancer, melanoma, because it is a skin condition, is visible as a discoloration on one's skin. Given the time and resources, I would want to train a diagnostic AI that takes a picture from a phone and determines whether or not a mole is cancerous.
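
A hypothetical starting point for that idea, sketched as transfer learning with a pretrained backbone; the dataset, image size, and training details are all assumptions, and nothing here is a validated diagnostic tool:

```python
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False                      # train only the new head at first

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # benign vs. malignant
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```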