Developing a Remote Controlled Vehicle Utilizing Augmented Reality
1/28/18: As of today, we have functioning versions of each body part for our prototype. I have finished designing and building the initial circuit for 12 servos. Since we could not get the servo shield to function properly, I used what I had learned about circuitry in my EclipseMob project to design my own circuit on a breadboard. I then 3D printed an internal skeleton for the spider that securely holds the Arduino Uno, breadboard, and battery. However, all of this work is only for trial runs. While Blake programs and tests our initial designs, I will continue working on a more efficient circuit that is capable of receiving instructions from a second Arduino Uno connected to the computer. The main challenge ahead is making our current circuits wireless, because Arduinos normally require a cable connection to a computer. I intend to bypass this by using a Bluetooth Shield without the Arduino Uno, instead connecting it directly to both power and the servo breadboard. The picture below shows the initial test circuit, containing the printed skeleton, the Arduino Uno, and the completed servo breadboard (without servos attached).
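To illustrate the servo side of this circuit, here is a minimal Python sketch of the angle-to-pulse-width mapping that hobby servos expect. Our actual control code will run on the Arduino; the 500–2500 microsecond range, 0–180 degree sweep, and 50 Hz frame rate mentioned below are typical hobby-servo defaults, not values measured from our servos.

```python
# Sketch of the angle-to-pulse-width mapping hobby servos expect.
# A servo reads a pulse repeated at ~50 Hz; the pulse's width sets
# the angle. The 500-2500 microsecond range is an assumed default.

MIN_PULSE_US = 500    # pulse width at 0 degrees (assumed)
MAX_PULSE_US = 2500   # pulse width at 180 degrees (assumed)

def angle_to_pulse_us(angle_deg):
    """Linearly map a servo angle (0-180 degrees) to a pulse width."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("servo angle must be between 0 and 180 degrees")
    span = MAX_PULSE_US - MIN_PULSE_US
    return MIN_PULSE_US + span * angle_deg / 180
```

Centering a servo would then mean sending the 90-degree pulse width repeatedly until the horn settles.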
1/16/18: Our first print of the Spider-bot’s body was unsuccessful. We printed it hollow, and as a result it warped and had a straw-like texture rather than being solid and smooth. The settings we used for the second print proved successful; the results are in the picture below. We printed at normal quality/speed on the Ultimaker, customizing a few settings in order to print faster while keeping desirable quality. The custom settings we used were a brim of 8mm, an infill density of 10%, and a layer height of 0.12mm. We additionally printed out a leg and assembled it. Aside from structural weaknesses at the top of the knee tendon and at the hip joint, the legs are ready to be attached to their servos, and subsequently to the body. The picture below shows the final body of the spider.
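As a rough illustration of why layer height drives print time: the layer count is the part height divided by the layer height, and print time scales roughly linearly with layer count. The 60mm body height below is a made-up example, not our part’s real dimension.

```python
# Rough sketch of how layer height drives print time. The part height
# used in the test is illustrative, not our spider body's dimension.

def layer_count(part_height_mm, layer_height_mm):
    """Approximate number of printed layers for a part."""
    return round(part_height_mm / layer_height_mm)
```

Halving the layer height roughly doubles the layer count, and with it the time spent laying down plastic.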
12/5/17: We have begun printing out all of the parts for our spider-bot: the body, upper leg w/ hip joint, lower leg w/ knee tendon, and nuts and bolts for securing joint rotation. We used Tinker-CAD to create our designs, and included housings for the servos, Arduino stacks, and circuits we intend to use. For our first prototype, we plan to get the servos working separately from the body first, then attach the body parts together with the servos. First, we will see if the body stands upright, which it should, considering that it is symmetrical and that we are using servos (which lock into position) rather than motors. Second, provided that it stands, we will attempt to have the whole spider execute some basic motions, primarily taking tiny steps forward. If we fail, we will have to evaluate whether the issue lies in the spider’s frame, its circuitry, or its code. Doing a preliminary servo test separate from the body should eliminate circuitry issues and most coding issues, but we expect errors or inaccuracies when integrating the servos with the body.
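The “tiny steps forward” test can be pictured as a simple crawl gait: one leg at a time lifts, swings forward, and plants while the other three stay put, and then the body shifts forward. The Python sketch below is purely illustrative; the leg names and ordering are assumptions, not our final gait.

```python
# A minimal crawl-gait sketch for a four-legged robot: one leg swings
# at a time while the other three stay planted, then the body shifts
# forward. Leg names and ordering are illustrative assumptions.

LEG_ORDER = ["front_left", "back_right", "front_right", "back_left"]

def crawl_cycle():
    """Return the ordered list of actions in one full step cycle."""
    actions = []
    for leg in LEG_ORDER:
        actions.append(("lift", leg))
        actions.append(("swing_forward", leg))
        actions.append(("plant", leg))
    actions.append(("shift_body_forward", "all"))
    return actions
```

Keeping three legs planted at all times means the body stays statically stable throughout the cycle, which is why this gait is a natural first test.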
11/17/17: This past week, Blake and I designed the complete circuit of our spider and put together a parts list to order when we get back from Thanksgiving break. We will use a Bluetooth Arduino shield on each of two Arduino Unos to communicate between the computer and the spider. The Arduino Uno on the spider will run on battery power, most likely a rechargeable battery for ease of use, and the Arduino Uno we use to transmit will be plugged into the computer. We chose this design for two reasons. The first is that we can create an intuitive interface for controlling the spider that will eventually integrate a camera for augmented reality. The second is that we can change the code while the spider is still on, allowing us to implement and test changes quickly and efficiently.
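One way to picture the computer-to-spider link is as a stream of tiny command packets. The Python sketch below shows an encode/decode pair (start byte, command byte, argument byte, checksum); this format is an assumption chosen for illustration, not our actual protocol.

```python
# Sketch of a tiny command packet the transmitter-side Arduino could
# send to the spider over Bluetooth. The framing byte and packet
# layout are illustrative assumptions, not our real protocol.

START_BYTE = 0xAA  # arbitrary framing byte (assumed)

def encode_packet(command, arg):
    """Pack one command and its argument with a simple checksum."""
    checksum = (command + arg) & 0xFF
    return bytes([START_BYTE, command, arg, checksum])

def decode_packet(packet):
    """Return (command, arg), or raise ValueError on a corrupt packet."""
    if len(packet) != 4 or packet[0] != START_BYTE:
        raise ValueError("malformed packet")
    command, arg, checksum = packet[1], packet[2], packet[3]
    if (command + arg) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    return command, arg
```

The checksum lets the spider-side Arduino silently drop packets garbled in transit rather than move a servo to a wrong angle.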
11/8/17: After settling on body designs, leg quantities, and sizes, we began to consider how to actually put the spider in motion. After a week of testing motors and circuits, we realized we had not done nearly enough research. Blake and Ben set out to find what is typically used in small robots that beginners to intermediates can build, and found that Arduino was, as it had been for the plane, a great system to use. Seeing that we were doing redundant research, Joe directed us to sit down and learn from Cole and Grace, who were already working with Arduinos, motor shields, and even servos. When they showed me their servos in action, I knew that was what we would need in our design. I asked for a brief explanation, and the Arduino platform made servo integration very intuitive. Blake and I then set out to find a servo shield (because the motor shield they were using only had 2 ports for servos), and discovered a 16-port servo shield that attaches directly to an Arduino, a perfect fit for the 12 servos that we would need in our design.
10/27/17: We have begun conceptualizing and drawing out preliminary designs of our spider-bot, determining which materials to build with, and figuring out what is feasible considering our low level of experience with robotics.
10/13/17: After cycling through various ideas (a car, a boat, a rover, etc.), we finally decided upon creating a walking “spider” robot. The core ideas that we aimed to implement were the following. First, in thinking about a car, we realized that the AR component could feed much more detailed information to the driver than it could in a plane or helicopter, simply because of the increased volume inside a car relative to the movement it performs. That volume would allow us to add more electronics and sensors, which could in turn feed the user interface. Second, we determined that a rover would be of much more interest in terms of research than a car. Though they are similar in concept, the rover would have a wider (but slower) range of movement, allowing for better usage of the sensors. However, even the rover felt like too limiting an idea, and given its simplicity, we were determined to challenge ourselves a little further. Third, a “spider” vehicle has a better range of movement than all of the other vehicle concepts except the helicopter. The key difference between the spider and the rover is that the spider uses 4 legs that can each be lifted and placed on higher surfaces, allowing the spider both to traverse more difficult terrain and to climb steps (of a certain height). Additionally, implementing the AR component did not seem much more difficult on the spider than on the rover, which gave the spider its clear advantages.
10/6/17: After two weeks of working, we again hit a block. We had created seemingly successful designs for the body, circuitry, and engines of the plane, but we ran into an issue: steering. It is standard in an airplane to use ailerons as the primary steering mechanism, but on a model plane this became increasingly difficult to implement as we began to design wing foils. An aileron is the hinged tab on the trailing edge of a wing that rotates up or down to change the lift (and drag) that wing generates; deflecting the ailerons in opposite directions rolls the plane, which is what allows it to turn. Though we designed a system where each wing had a shaft attached to a motor located inside the fuselage of the plane, which minimized complications, the wing foil itself was too difficult to design. Rather, it was not worth the time it would require to make a highly functional and optimal design. More importantly, our group intended the project to be about the implementation of AR into RC vehicles. Though we found the physics, mathematics, and engineering behind designing the wings both interesting and gratifying, it strayed from (and took away from) the original intent of the project. Before investing too much time in another risky endeavor, we decided to call the plane concept a failed idea for a vehicle and learn from this failure. Going forward, we refined our sense of our actual goal in order to achieve it with more efficiency and a greater possibility of success. One step backward, two steps forward.
9/22/17: We began to realize that a helicopter specifically posed many challenges we were not ready to overcome. Though with sufficient effort it could have been done, the task was too daunting and we had already begun to think about how we could integrate the same concept for a more feasible vehicle. We decided on the concept of a plane, but in hindsight this was not much simpler.
9/8/17: I decided to take leadership of the project because I had general knowledge and organizational skills that were useful in facilitating collaboration. At this point, I assigned a few roles. Blake took the lead on developing code for the motors, gyroscope, and receiver/transmitter system, as well as finding the Arduino board most efficient for our usage. Felix took the lead on finding parts and designing a frame for the body of the helicopter using Tinker-CAD and Google SketchUp, and Ben took the lead on testing properties of the motors that WISRD already had (ones not specialized for the project).
8/28/17: After completing the experiment for the EclipseMob project and beginning the school year, I decided to work on a project with Blake, Ben, and Felix. Joe had originally suggested that we work on a helicopter simulation using hydraulics, so we began to look into what that would entail, along with what materials were already at our disposal. The golf cart was of particular interest, as we considered using the battery and motor to create a more powerful life-size simulation. However, we realized that we shared more interest in bridging remote controlled vehicles with simulation technology. Out of long debates was born the first project concept: an RC helicopter where the pilot uses augmented reality to assist in flight.
EclipseMob: Radio Wave Propagation During an Eclipse
8/22/2017: I recorded the final data run and uploaded it. I began recording at 10:15 am and ended at 10:35 am. This data run was to confirm the baseline frequencies that the receiver recorded on 8/20.
8/21/2017: I recorded data during the eclipse, including the peak of the eclipse. The following are all specific to the location of recording (Wildwood School): The partial eclipse began at 9:05:29.8 am, I began recording data at 9:45 am, the peak of the partial eclipse occurred at 10:20:49.8 am, I finished recording at 11:00 am, and the partial eclipse ended at 11:44:19.2 am. The signal was not found, and I uploaded the data.
8/20/2017: I set up the receiver on the deck of Wildwood School for a test run in order to establish the baseline frequencies that are received relative to time of day and location. I failed to record at the exact time of day that the peak of the partial eclipse would occur the following day, and as such it is a less accurate baseline. However, I did successfully record within the total duration of the partial eclipse: I began recording at 11:30 am and finished at 11:55 am, whereas the partial eclipse would end at 11:44:19.2 am. The signal was not found, and I uploaded the data as a .wav file to EclipseMob’s server.
8/14/2017: I installed the four 9V batteries required to power the circuit and antenna, and connected all 3 units (batteries, receiver, and antenna). On the first test with the voltmeter, the voltage output by the circuit was too high to be plugged into a smartphone without causing damage: it was outputting 11.57V. While looking through the circuit diagram, I found that I had plugged the wire from the V port on the TRRS terminal into the positive bus rather than the negative. Luckily, this was the only mistake, as it was easily noticeable and the final step in assembling the circuit. Upon moving the wire to the negative bus, the circuit output 0.24V and 0.935mA, both of which are safe to input into a smartphone. After downloading the EclipseMob application from the Google Play Store, I tested the application with the assistance of J. A. Wise, whose phone was and will be used for data recording and reporting; a 60kHz signal was not received. The maximum frequency displayed on the graph over the test recording interval was just above 20kHz (the exact value was not displayed on the graph). This could indicate a range of things, one possibility being that the 60kHz signal is not yet being broadcast. In order to investigate the possible cause, and whether the signal is being broadcast yet, we inquired on the forum.
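The voltmeter check above amounts to comparing the measured output against safe-input thresholds before plugging anything into a phone. In this Python sketch the thresholds themselves are assumptions chosen for illustration (headset-jack inputs tolerate on the order of a volt), not specifications from EclipseMob.

```python
# Sketch of the pre-connection sanity check. Both thresholds are
# assumed illustrative values, not EclipseMob specifications.

MAX_SAFE_VOLTS = 1.0       # assumed threshold for a headset-jack input
MAX_SAFE_MILLIAMPS = 5.0   # assumed threshold

def safe_for_phone(volts, milliamps):
    """True if the measured output is within both assumed limits."""
    return volts <= MAX_SAFE_VOLTS and milliamps <= MAX_SAFE_MILLIAMPS
```

Under these assumed limits, the miswired 11.57V reading fails the check while the corrected 0.24V / 0.935mA reading passes.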
8/9/2017: I removed the 24AWG magnet wire, which damaged a few of the cord clips in the process. I replaced those clips before wrapping both sets of 100 coils of the antenna with the correct 28AWG magnet wire (200 coils total: 100 upper, 100 lower).
8/7/2017: I attached all 8 cord clips to the antenna box. I then wrapped the upper 100 coils (out of the total 200) of the antenna, before realizing that I had used the wrong gauge of magnet wire. I had used 24AWG magnet wire, while the instructions called for the use of 28AWG magnet wire.
7/20/2017: I built the circuit component of the receiver from the instructions found on the EclipseMob website. This excluded the antenna and its connection to the circuit; the battery unit and its connections to both the circuit and the antenna; and the TRRS terminal, which connects to a smartphone. The instructions on the website caused some confusion, as the instructions for the second half of the circuit were oriented opposite to those for the first half. This would have posed no inconvenience, except that the charge of the buses is not symmetrical and the numbering of the breadboard rows is directional. Both factors forced me to reorient the instructions for the second half of the circuit so that they aligned accurately with those for the first half, without causing overlap or faults in the polarity of the power delivered to certain areas of the circuit. Editor’s note: Since the date I assembled this component (7/20/2017), the instructions have been updated with much more clarity and continuity.
7/14/2017: I am presenting the EclipseMob experiment to the Santa Monica Astronomy Club. EclipseMob Presentation
6/28/2017: I began researching the purpose of EclipseMob, as well as the procedure for building the radio receiver and detecting the effects that the solar eclipse will have on the ionosphere and, subsequently, radio waves. EclipseMob’s experiment broadcasts radio waves from 2 locations, the WWVB radio station in Colorado and the Navy transmitter in California, at a frequency of 60kHz. The project uses crowd-sourcing to gather data. As such, participants build receivers that feed data to a smartphone, which then allows for the data to be reported to EclipseMob. The data to be reported includes the range of frequencies that are received and the GPS location of the smartphone, both being plotted constantly over the interval of recording.
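The core of the processing step, asking how much energy a recording contains at one target frequency, is commonly done with the Goertzel algorithm. The Python sketch below illustrates that technique in general; it is not EclipseMob’s actual pipeline, and the sample rate and frequencies used in testing it are arbitrary audio-band values.

```python
import math

# Sketch of single-frequency detection using the Goertzel algorithm:
# an efficient way to measure signal power at one target frequency
# without computing a full FFT. Illustrative only; not EclipseMob's
# actual processing pipeline.

def goertzel_power(samples, sample_rate, target_freq):
    """Return the signal power at target_freq in the given samples."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin
    w = 2 * math.pi * k / n
    coeff = 2 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2
```

A recording containing a pure tone shows far more power at that tone’s frequency than at any other bin, which is the basis for deciding whether a target signal was received.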
Abstract: Radio waves are electromagnetic waves; the ones studied in this experiment fall in the low-frequency (LF) and very-low-frequency (VLF) bands of the spectrum. They are often used for communication over long distances, and are vital in emergencies when landline communications and the infrastructure surrounding other types of communication are damaged. As a result, it is important to study the effects that natural phenomena have on radio wave propagation, and such is the aim of the EclipseMob experiment. EclipseMob is a crowd-sourced project hosted by The Geological Society of America, The National Association of Geoscience Teachers, the University of Massachusetts Boston, and George Mason University, and funded by the National Science Foundation. The research teams at the universities listed above utilize data collected from across the United States in order to create a map of the range of frequencies that participants received in their geographic locations over the duration of the eclipse. The data reflected the reception of a specific frequency, 60 kHz, which is broadcast continuously by the National Institute of Standards and Technology from the WWVB radio station in Colorado in order to synchronize clocks in the US. Participants built radio receivers, from instructions provided by EclipseMob, that connected to a smartphone; the data collected by the receiver was first uploaded to an application on that smartphone, then processed to determine whether the 60 kHz frequency had been received, and finally sent to the EclipseMob research team regardless of whether that signal had been found. This data can ultimately aid in understanding changes in the ionosphere during certain phenomena (in this case a solar eclipse), and how these changes can affect the forms of communication we rely on most during emergency situations.
EclipseMob is far from the first experiment of its kind, but it is the largest crowd-sourced study of radio wave propagation during an eclipse. The first experiment of this kind was conducted on April 17, 1912 by William Henry Eccles. Eccles was considered a pioneer of radio communications, and was involved in the study of the effects of the ionosphere on the mode of communication he pioneered. He observed that radio wave propagation was more effective during the night than during the day, which he theorized was a result of changes in the ionosphere due to solar radiation. He and a few other researchers conducted experiments during the 1912 solar eclipse over Europe in England, France, and Denmark, in which various radio frequencies were transmitted over long distances and the strength of their reception was measured, with the aim of finding a correlation with the eclipse. These experiments had the potential to support Eccles’s earlier observation, as the eclipse provided a shadowed (and potentially deionizing) region adjacent to a normative region (with an ion density that was standard for that time of day). However, the research was inconclusive, and so other experiments, including EclipseMob, have since aimed at providing a greater understanding of how a solar eclipse affects the ionosphere.
Radio Communication and the Ionosphere:
Radio communication utilizes two different forms of transmission based upon the distance that the wave must travel. For short-distance communication, ground waves are sufficient. Ground waves are as they sound: radio waves propagated at near-ground level. This form of communication is highly effective, as it encounters less interference than the alternative, sky waves. However, because ground waves can only follow the curvature of the earth to a limited extent, ground-wave communication is limited to certain distances. When longer-distance communication is necessary, radio waves are instead propagated at a calculated angle towards the sky. These sky waves generally reflect off of the ionosphere to be received in another location. This method allows radio waves to travel a much farther distance, but it has a disadvantage. The ionosphere, as its name suggests, is the layer of the atmosphere in which gas is ionized by ultraviolet radiation from the sun; as such, ion density is highest during the day and lowest during the night. It is split into D, E, F¹, and F² layers, where the D layer has the lowest altitude and therefore the highest gas density, and the F² layer has the highest altitude and therefore the lowest gas density. Each layer deionizes at a different rate, where the rate of deionization is directly proportional to density, and therefore inversely proportional to the altitude of the layer. This is because at altitudes of higher density there are more ions and consequently more opportunities for collisions, which are the cause of deionization. In the E and F layers, the density is low enough that these collisions do not happen as often as in the D layer, meaning that those layers remain ionized long after radiation has stopped being applied; in fact, they remain ionized throughout the entire night, whereas the D layer loses its ionization during the night. This property of the ionosphere is what allows for radio communication via sky waves.
At the ion densities of the E and F layers during both day and night, LF and VLF radio waves reflect off of the ions, propagating back down to the surface of the earth. However, the properties of the D layer present some difficulties. During the day, its ion density is too high for some LF, and especially VLF, radio waves to pass through and then reflect off of the E or F layers; instead, the waves are absorbed by the ions. During the night, the D layer’s ion density is low enough for LF and VLF radio waves to pass through it undisturbed and reach the reflective E and F layers.
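The reflect-or-pass behavior described above is often summarized by the textbook approximation for the critical (plasma) frequency of an ionized layer: f_c ≈ 9√N, where N is the electron density in electrons per cubic meter and f_c is in hertz. Waves below f_c reflect; waves above it pass through. The density used in the example below is a round illustrative value, not a measurement.

```python
import math

# Textbook approximation for the critical (plasma) frequency of an
# ionospheric layer: f_c ~ 9 * sqrt(N), with N in electrons per cubic
# meter and f_c in Hz. Waves below f_c reflect off the layer; waves
# above it pass through. Densities used in testing are illustrative.

def critical_frequency_hz(electron_density_per_m3):
    """Approximate critical frequency for vertical incidence."""
    return 9.0 * math.sqrt(electron_density_per_m3)
```

For instance, a density near 10¹² electrons/m³ gives a critical frequency of about 9 MHz, far above the 60 kHz signal studied here, which is consistent with LF/VLF waves reflecting even off the relatively thin nighttime layers.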