Monday, December 3, 2018

Understanding Key UAS Fundamentals through Applied Research

As the semester winds down, our class has been tasked with a group capstone project. We met twice to discuss the project and our roles in it. We began by asking what each of us wanted to get out of the project and what we were interested in, since we all wanted clearly defined roles. We then described everything essential to a UAS operation that would gather data and eventually produce deliverables. We knew there would be managers overseeing the whole operation and communicating with Dr. Hupy about how the project is going. Beyond that, people would be in charge of data, flights, writing, ground control, system integration, payload integration, operation management, and flight engineering. Each of these tasks was identified as essential, with one person fully responsible for it as a permanent job that must be carried out to ensure success. Of these jobs, I chose flight engineer. I will be in charge of ensuring all members are properly trained to operate the platforms and the sensors aboard them. This is crucial since Indiana weather is generally unpredictable and can leave group members without flying for extended periods of time. Ensuring that each crew that flies is properly trained on each platform will reduce the chance of mishaps.
Although each group member has a permanent role, everyone also has rotating roles that ensure everyone gets to participate in the project's many jobs. These will mostly be assistant jobs, meaning you help someone with their permanent role. This will allow everyone to get a feel for how the other jobs are done. Our group will also be broken up into teams, or flight crews. These will be permanent and will be chosen depending on everyone's schedule for the next semester.

Tuesday, November 27, 2018

Video GeoTagger

Video GeoTagger is a free online software package used to create geolocated videos. This means someone can overlay a map, GPS points, and their video to let a viewer see exactly where the video was taken. This is particularly useful when the data is being analyzed by someone who was not at the site where the video was recorded. For our use, we had a drone take a video and we were tasked with geotagging it. Below is the first video that was completed using the free geotagger software. The interface is simple and easy to work with. You load the video into the software, then load the GPS points onto the map. You start the video and tag it at the start, then watch the entire video and tag it at the end, making sure that the points match up exactly with the time shown on the video.

This was done once more with another video, following the exact same process.
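The heart of this tagging step is lining up the GPS track with the video clock: once the start and end are tagged, any moment in the video can be matched to a position by interpolating between the nearest two GPS fixes. Below is a minimal Python sketch of that idea; the `track` format and the `locate_frame` helper are hypothetical illustrations, not how Video GeoTagger is actually implemented.

```python
from bisect import bisect_left

def locate_frame(track, video_time_s):
    """Return an interpolated (lat, lon) for a moment in the video.

    track: list of (t, lat, lon) tuples sorted by t, where t is seconds
    since the point that was tagged as the start of the video.
    """
    times = [p[0] for p in track]
    i = bisect_left(times, video_time_s)
    if i == 0:
        return track[0][1], track[0][2]        # before the first fix
    if i == len(times):
        return track[-1][1], track[-1][2]      # after the last fix
    t0, lat0, lon0 = track[i - 1]
    t1, lat1, lon1 = track[i]
    f = (video_time_s - t0) / (t1 - t0)        # linear interpolation factor
    return lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0)
```

This is also why matching the tags exactly matters: any offset between the tagged start time and the true start of the GPS log shifts every interpolated position along the flight path.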


GeoTagged videos give the viewer much more information about the flight, which is why they are widely used for videos that deliver data. Showing exactly where the video was taken gives the viewer the perspective needed to analyze the data accurately. While these videos are useful in many circumstances, they require ground control points or referenced locations to be most useful. Anywhere without a noticeable feature, such as a tree, man-made structure, or river, would be hard to use this software on. The software we used was free, so it does not have the accuracy that a survey-grade package would have.

Thursday, November 15, 2018

Impact of altitude on high-resolution multi-spectral remote sensing for hardwood forest species delineation


               The research project chosen looks into the impact of altitude on high-resolution multi-spectral remote sensing for hardwood forest species delineation. We want to see whether there is a considerable difference in data collection between a manned aircraft and an unmanned aircraft. Altitude is the biggest factor between the two: the unmanned system flies lower while the manned aircraft flies at a greater altitude. The purpose of an annotated bibliography is to give the reader insight into what has already been done concerning our research. An annotated bibliography consists of articles, journals, research papers, and books that have been published (preferably peer-reviewed). A research timeline is also an important part of this research; it lets groups see where they should be in the research process. This can be done in many ways, but the most common is a Gantt chart.

(Adão et al., 2017)
Adão, T., Hruška, J., Pádua, L., Bessa, J., Peres, E., Morais, R., & Sousa, J. J. (2017). Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sensing, 9(11), 1-30. doi:10.3390/rs9111110
This article focuses on the use of hyperspectral sensors on UAS. It addresses the advantages of hyperspectral sensors over RGB or NIR cameras for data acquisition in agriculture and forestry. This is a great source of information because it highlights drone usage, checklists for those platforms, and how to effectively collect the data. The only setback is that it does not go very in-depth on the use of this technology for specific scenarios in forestry and agriculture.
(Bonnet, Lisein, & Lejeune, 2017)
Bonnet, S., Lisein, J., & Lejeune, P. (2017). Comparison of UAS photogrammetric products for tree detection and characterization of coniferous stands. International Journal of Remote Sensing, 38(19), 5310-5337. doi:10.1080/01431161.2017.1338839
This article goes into depth about the use of UAS to detect trees and assess forest attributes for coniferous stands, more specifically the age of the trees and of the surrounding stand. While this article is great for seeing how they did it, it does not give us a great idea of what to expect for our own study. The group could get some ideas from this paper on how to go about our research. It also does not compare UAS to manned aircraft.
(Gabrlik, la Cour-Harbo, Kalvodova, Zalud, & Janata, 2018)
Gabrlik, P., la Cour-Harbo, A., Kalvodova, P., Zalud, L., & Janata, P. (2018). Calibration and accuracy assessment in a direct georeferencing system for UAS photogrammetry. International Journal of Remote Sensing, 39(15/16), 4931-4959. doi:10.1080/01431161.2018.1434331
In this article, the study looks at how well a custom-built multi-sensor system for direct georeferencing can be calibrated and operated. This would enable centimeter-level accuracy for mapping an area. The article covers RTK, GNSS, and INS in detail, which could help our project collect data as accurately as possible. It has good information on these topics, but for our use it does not say much about the actual data being collected.
(Getzin, Nuske, & Wiegand, 2014)
Getzin, S., Nuske, R. S., & Wiegand, K. (2014). Using Unmanned Aerial Vehicles (UAV) to Quantify Spatial Gap Patterns in Forests. Remote Sensing, 6(8), 6988-7004. doi:10.3390/rs6086988
This article looks into whether the gap distribution of forests reflects the impact of man-made tree harvesting or naturally occurring patterns of tree death, and goes into further causes that could have this effect on the forest. They use a UAV because of the small gaps between the trees, which cannot be measured accurately from manned planes. This article is great because it shows how they utilized a drone for this purpose and how it worked. It goes into more detail than we need, but it is still a good knowledge base. It does not detail how much more accurate this data is than data from a regular manned plane.
(Lisein, Michez, Claessens, & Lejeune, 2015)
Lisein, J., Michez, A., Claessens, H., & Lejeune, P. (2015). Discrimination of Deciduous Tree Species from Time Series of Unmanned Aerial System Imagery. PLoS ONE, 10(11), 1-20. doi:10.1371/journal.pone.0141006
This article addresses how and when UAS should be used to efficiently discriminate deciduous tree species. It details how the authors tried to find the best way, and the best time of year, to achieve optimal species discrimination. They state when they started and ended collection, then classified the data to discriminate tree species. This is a great article for our group since it is very close to what we are doing; instead of only deciduous tree species, we are looking at general tree species and whether there is a difference between manned and unmanned collection.
(Manfreda et al., 2018)
Manfreda, S., McCabe, M. E., Miller, P. E., Lucas, R., Madrigal, V. P., Mallinis, G., . . . Toth, B. (2018). On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sensing, 10(4). doi:10.3390/rs10040641

This study looks at the use of UAVs in environmental monitoring. It discusses how this work is done mostly by ground surveys and satellites, but can now be done more easily and efficiently with UAVs. Ground and satellite methods have constraints that limit them, while UAVs have few. The paper's aim is to provide an overview of the existing research on UAS in these fields.
(Pádua et al., 2018)
Pádua, L., Hruška, J., Bessa, J., Adão, T., Martins, L. M., Gonçalves, J. A., . . . Sousa, J. J. (2018). Multi-Temporal Analysis of Forestry and Coastal Environments Using UASs. Remote Sensing, 10(1). doi:10.3390/rs10010024
This paper looks at the advantages and challenges of UAVs for imagery and data collection in forestry and coastal environments. Two case studies are done, one focusing on chestnut tree health and the second on the sandpit of Cabedelo, over different time periods. The paper makes some good points, mostly about how UAS and sensors have really taken off. However, it does not touch on our project much; other than the techniques used for chestnut tree health, we can only learn from how they carried out their work.
(Puliti, Ørka, Gobakken, & Næsset, 2015)
Puliti, S., Ørka, H. O., Gobakken, T., & Næsset, E. (2015). Inventory of Small Forest Areas Using an Unmanned Aerial System. Remote Sensing, 7(8), 9632-9654. doi:10.3390/rs70809632

This article looks at the use of UAVs to inventory small forests. The study combines 3D variables from UAV imagery with ground reference data to create linear models for mean height, dominant height, stem number, basal area, and stem volume. According to the article, the data on this topic before the study was inconsistent and unreliable. This is a good article since we are looking at forests and categorizing them, but it does not talk about species; it only touches on size and actual tree dimensions.
(Singh & Frazier, 2018)
Singh, K. K., & Frazier, A. E. (2018). A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. International Journal of Remote Sensing, 39(15-16), 5078-5098. doi:10.1080/01431161.2017.1420941
This article performed a search using UAS-related keywords to identify peer-reviewed studies. The results were then filtered down by a handful of keywords, and a subset was selected and deeply analyzed. The authors found that UAS practice needs better standardization of methods and procedures for data collection.
(Wieser et al., 2017)
Wieser, M., Mandlburger, G., Hollaus, M., Otepka, J., Glira, P., & Pfeifer, N. (2017). A Case Study of UAS Borne Laser Scanning for Measurement of Tree Stem Diameter. Remote Sensing, 9(11), 1-11. doi:10.3390/rs9111154
This study looks at the diameter at breast height (DBH) of trees in forestry. They use laser scanners onboard UAS that create high-resolution point clouds, and DBH is estimated from the UAS point cloud. This study is good for our methods and techniques, but the data is not too useful for us: they use sensors to measure actual dimensions while we use sensors to classify data.


Timeline



Above is a Gantt chart, a method of tracking progress. It is widely used in many industries, especially aviation, to keep track of the timeline that must be followed. This Gantt chart was made for this study; it starts at the beginning of the school year and runs until the end of the spring semester. The first task is research on similar case studies, to see whether the question has already been answered and what methods people used. This gives an outlook on how the project might go, or whether it is a question that still needs answering. Data collection happens during this time as well, because the two can be done independently of each other. Since the project uses a manned aircraft that needs an integrated sensor, sensor integration is also included; the second half of that task covers taking the sensor off the manned aircraft, since that could take a little longer than removing a couple of bolts. After data collection begins, data analysis can also start, overlapping with other tasks. As data analysis goes on, it runs into the conclusion of the study and the paper, and notes are taken as the analysis takes place.











Adão, T., Hruška, J., Pádua, L., Bessa, J., Peres, E., Morais, R., & Sousa, J. J. (2017). Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sensing, 9(11), 1-30. doi:10.3390/rs9111110
Bonnet, S., Lisein, J., & Lejeune, P. (2017). Comparison of UAS photogrammetric products for tree detection and characterization of coniferous stands. International Journal of Remote Sensing, 38(19), 5310-5337. doi:10.1080/01431161.2017.1338839
Gabrlik, P., la Cour-Harbo, A., Kalvodova, P., Zalud, L., & Janata, P. (2018). Calibration and accuracy assessment in a direct georeferencing system for UAS photogrammetry. International Journal of Remote Sensing, 39(15/16), 4931-4959. doi:10.1080/01431161.2018.1434331
Getzin, S., Nuske, R. S., & Wiegand, K. (2014). Using Unmanned Aerial Vehicles (UAV) to Quantify Spatial Gap Patterns in Forests. Remote Sensing, 6(8), 6988-7004. doi:10.3390/rs6086988
Lisein, J., Michez, A., Claessens, H., & Lejeune, P. (2015). Discrimination of Deciduous Tree Species from Time Series of Unmanned Aerial System Imagery. PLoS ONE, 10(11), 1-20. doi:10.1371/journal.pone.0141006
Manfreda, S., McCabe, M. E., Miller, P. E., Lucas, R., Madrigal, V. P., Mallinis, G., . . . Toth, B. (2018). On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sensing, 10(4). doi:10.3390/rs10040641
Pádua, L., Hruška, J., Bessa, J., Adão, T., Martins, L. M., Gonçalves, J. A., . . . Sousa, J. J. (2018). Multi-Temporal Analysis of Forestry and Coastal Environments Using UASs. Remote Sensing, 10(1). doi:10.3390/rs10010024
Puliti, S., Ørka, H. O., Gobakken, T., & Næsset, E. (2015). Inventory of Small Forest Areas Using an Unmanned Aerial System. Remote Sensing, 7(8), 9632-9654. doi:10.3390/rs70809632
Singh, K. K., & Frazier, A. E. (2018). A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. International Journal of Remote Sensing, 39(15-16), 5078-5098. doi:10.1080/01431161.2017.1420941

Wieser, M., Mandlburger, G., Hollaus, M., Otepka, J., Glira, P., & Pfeifer, N. (2017). A Case Study of UAS Borne Laser Scanning for Measurement of Tree Stem Diameter. Remote Sensing, 9(11), 1-11. doi:10.3390/rs9111154

Sunday, November 11, 2018

Field Outing at Dr. Hupy's

For our lab, our class was to take video and pictures of Dr. Hupy's property. This data will be used for analysis later on. The platform used was a Yuneec H520, an orange hexacopter. Before the flight, our class was taken around the property to set down ground control points (GCPs). These are marked points on the ground with a known geographic location. To be used, GCPs must be visible in the aerial photos, so we placed them in locations that would not be blocked by trees or other obstacles. We placed a total of 9 of these points around the property, noting where each was put. Once done, the team picks up the pads in the opposite order they were placed: if you number them 1-9 and place them down in sequential order, you pick them up from 9 to 1. With the GCPs set down, the ground station made a flight plan, the drone lifted off, and the mission began. It was a success with no mishaps or incidents.

GIS Day

Main Events:

10:00 –11:00 am
Keynote: GIS for natural resources management at the United Nations (STEW 206), Dr. Nicolas Picard, Food and Agriculture Organization of the United Nations

11:00am–12:20pm
Presentations (STEW 206)
Spatial Humanities: What Is and What Can It Be (Prof. Sorin Adam Matei, Associate Dean of Research and Graduate Education, College of Liberal Arts)
Race and Spatial Humanities (Prof. Kim Gallon, Assistant Professor of History)
Forest structural diversity as a predictor of ecosystem function in North America (Dr. Elizabeth LaRue, Forestry and Natural Resources)
My laptop takes forever, now what! (Eric Adams, ITaP Research Computing)

GIS Day was a day to learn more about GIS and how it fits with UAS operations. Above is the main schedule that was posted (although there were more events afterwards). Although I could not make it due to illness, I was able to talk to classmates who did go to get some information about what happened. The main speaker was Dr. Picard, who is affiliated with the Food and Agriculture Organization of the United Nations. He talked about GIS data, how to analyze it to its full potential, and mistakes others made when trying to put out good data. The next presenter was Professor Sorin Matei, who also talked about research projects that utilized drones. After him came Professor Kim Gallon, who focused on the ethics of GIS. Professor Gallon said that people were looking at the numbers obtained from data capture and seeing them as just that, numbers, reducing humans to figures on a screen or graph. The talks seemed very relevant for us in the UAS major.
In other classes I have taken, mainly my remote sensing class, the ethical issue of treating people like a number or figure has come up. In my opinion it does dumb down what is actually occurring, whether serious or not. But recognizing that does not solve much; there is not much anyone can do that will have a lasting effect on those statistics.
Our class has fewer than 10 people, but our career outlooks vary widely. This major can branch out in many ways: some students may want to use it for GIS applications, while others may want to use it for design and manufacturing. Personally, I am not interested in the data side of drones but in designing and manufacturing them. That knowledge still helps if I were to build a specialized drone for a mission requiring professionally outfitted sensors. Having a team of well-rounded individuals, each with extensive knowledge in one area, helps a team move forward easily. If an entire team consisted only of people with knowledge of data analytics, they would run into difficulty in the field, where they might not know much about platform usability and troubleshooting, and similarly the other way around.
Altogether, GIS is a vital market for drones. Drones are going to be used for these applications more and more. The technology is evolving at an ever-increasing rate and will improve how these machines do their job, making our job even easier.

Tuesday, October 30, 2018

Field Outings

For our outing, we were to map a section of McCormick Woods. Dr. Hupy, Ryan Ferguson, and Evan Hockridge ran the ground station, while the rest of the class were assigned as visual observers. For this mission our class gathered data on the different colors and species of trees. The main focus was to see whether it is plausible to classify trees based on their colors during the fall season. The Yuneec platform was used to gather data at 62 meters. This is an ongoing process, and data is gathered every Tuesday. Since the forest is large enough to lose visual sight of the aircraft, we split up into three groups: two observer groups and a flight control group. Each group was assigned a radio; when a group had visual contact with the aircraft, they would report it to flight control, and when contact was lost that was reported as well. The three groups rotated around the forest after each flight to get all the required data. Communication was brief but professional to ensure that line of sight was constant.

Thursday, October 25, 2018

QGIS Introduction

GIS is an acronym for geographic information system. GIS is not just software; it involves the entire process of presenting geographical data, from capturing it to processing, storing, and presenting it. Dr. Hupy stressed that GIS is not a piece of software but a group of people working together to turn data into a usable, readable form.

Proprietary GIS software generally involves a company that owns the rights and the intellectual property behind the software. The company has its own employees who code and maintain the program so that people can use it, at a cost; this software is not free and is generally very expensive. Open source GIS software is free and relies on the users of the program to make it better. Programmers who use open source can embed their own code into a version made specifically for them or their use. No one owns the foundational code, so anyone who wants to use it is free to do so, but they cannot sell or copyright their own modifications, since the code is owned by the users. Any modifications are freely made available to any user who might be interested. This is one of the ways open source GIS software is upgraded: users make small or big improvements, which are then sent out to be analyzed, and if an upgrade is deemed useful to the program as a whole it can be implemented into the core software.

Open source GIS software is particularly useful for the UAS industry because users can build their own software tailored to their specific needs. They can also work with groups of people to ensure that the software presents the best form of data to the end consumer. If a company that processes data for farmers and crops wants a map, it will be drastically different from a map needed for mining operations, and the software can be directed toward whichever map is needed. With proprietary software, if a user needs a tool that does not exist, they must file a technical ticket describing what they need. If the company decides the tool is useful, its software engineers will create it, and the user then has to buy the tool from the company, potentially at great cost.

In QGIS, we were to make a hillshade map from previously gathered data. We took the color imagery and created a DSM, or digital surface model. From there, we ran an analysis of the DSM to create a hillshade of the map. If we wanted, we could choose from different color presets, each color representing a change in elevation. The previous lab required a different software package called ArcMap, although I could not do it due to being out of state for an aviation-related event.
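For intuition, the hillshade analysis that QGIS runs can be approximated in a few lines of NumPy. This is a sketch of the standard illumination formula with the usual default sun position (azimuth 315°, altitude 45°), not QGIS's exact implementation; the `hillshade` helper and its conventions are assumptions for illustration.

```python
import numpy as np

def hillshade(dsm, cell_size=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """Compute a hillshade (0-255) from a DSM elevation grid.

    dsm: 2-D array of elevations; cell_size: ground distance per pixel.
    Brightness is the cosine of the angle between each cell's surface
    normal and the direction of the simulated sun.
    """
    az = np.radians(360.0 - azimuth_deg + 90.0)  # compass -> math angle
    alt = np.radians(altitude_deg)
    dy, dx = np.gradient(dsm, cell_size)         # elevation derivatives
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return (255 * np.clip(shaded, 0, 1)).astype(np.uint8)
```

On a perfectly flat DSM every cell gets the same brightness (the sine of the sun altitude), which is why hillshades only become interesting where the surface has relief.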

Creating a false color IR image of the farm was another task. First, the image was split into its separate bands. Then the layers were stacked and each band was assigned a different display color. At first, setting the bands to different colors was difficult; the correct sequence was needed before the image would display as false color IR. After trying different sequences, the image eventually displayed a false color IR view of the farm.
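The band-stacking step amounts to reordering channels. Here is a minimal NumPy sketch of the classic false color IR assignment; it assumes a separate NIR band is available from the sensor, and the `false_color_ir` helper is a hypothetical name for illustration.

```python
import numpy as np

def false_color_ir(red, green, nir):
    """Build a false color IR composite from single-band arrays.

    The classic sequence maps NIR -> red channel, red -> green, and
    green -> blue, which is why healthy vegetation renders bright red.
    """
    return np.stack([nir, red, green], axis=-1)
```

Getting this sequence wrong is exactly the "different sequences" problem described above: the composite still displays, but vegetation no longer pops out in red.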

GIS software and data are very useful for the UAS industry. Most of the industry's work involves high-tech sensors and displaying data. Farmers, miners, construction, and government projects all rely on software processing to turn raw data into something that solves a problem. This software is going to prove very useful for our capstone project on tree species delineation, which asks whether a UAS can perform this task better than a manned plane from varying altitudes.


Tuesday, October 9, 2018

Pathfinders Trip

On October 4th-7th I was invited to attend the Pathfinders Gala award ceremony in Seattle, Washington. Dr. Kozak was leading the trip and wanted me to represent Purdue in both aeronautical engineering technology and unmanned aerial systems. The gala was not until Saturday, but there were events beforehand, introducing us to some of the people who made the Pathfinders Gala possible.
     On the first day, we flew out and had free time since nothing was planned for Thursday. My group and I got to sightsee Seattle and take in the different culture there. The next day was full of events. In the morning, we had a personalized tour of the Boeing facility. This was my favorite part; we were able to stand on the floor right next to where Boeing assembles its 787s, 777s, 747s, and other jets. The first plane we saw was a 787 Dreamliner being assembled. The first stage was a tail section being assembled and readied for the next stage. We were able to walk inside the tail section and look around to see how complex these airplanes truly are. It was an amazing experience. The next couple of stages we could not walk through because the planes were being tested and worked on. The last stage was an almost fully built 787; the only part not yet finished was the engines. The tour guide (a Purdue alumnus) said that whenever an aircraft is delayed, it is most likely the engines or the seats on board. We were able to walk inside the new 787 and take a look at what a brand new one looks like, another amazing experience I will not forget. After the 787, we went to see the 777-X. This experimental aircraft has foldable wingtips to decrease the overall wingspan for taxiing and docking at gates. Funnily enough, we did not get to see the actual aircraft; they did not know where this monstrous plane had gone, which just proved to me how big the manufacturing plant actually is. Although we did not see the whole plane, we did see a 777-X wing with the wingtip stowed (up position). Afterward we went inside a 747 freighter; again, this plane was huge and an awesome sight to see. We walked around the deck and underneath it, able to see all that was there and what still needed to be finished.
At the end, we stumbled upon the senior director of the facility, another very enlightening experience. That summed up our tour: truly a one-of-a-kind experience for someone who has never been in an environment like that.
     After the tour we attended the Raisbeck high school event, held prior to the actual gala. Here we were able to talk to some very brilliant minds in the aviation world. I had the absolute honor of sitting next to Peter Morton, a 42-year Boeing employee who retired as a vice president. He asked me questions about the integration of unmanned aerial systems into U.S. airspace and whether I believed UAS would take over manned commercial aviation. It was an awesome experience to talk with Peter. After this event, we were invited to James Raisbeck's house, where we talked with the high school students who were a part of the event and toured his house, priceless cars, and artifacts. The next day was the actual Pathfinders Gala, held at the Museum of Flight, where we ate under some very special aircraft: an SR-71, a Ford Tri-Motor, and a Scan Eagle. The gala consisted of introducing previous Pathfinders and interviewing the two newest, Phil Condit and Dennis O'Donoghue. Both were asked how they came to aviation and what lessons they learned along the way. Very inspiring talks from both, showing that if you put your mind to something, you can achieve it. At the end, a surprise was announced: Peter Morton was also receiving a Pathfinders award for his contribution to the Pathfinders event. Instead of two winners, there were three. Peter was stepping down as the founder, so the board found it appropriate that he be awarded as well.
     This Seattle trip was the best trip I have ever been on. It showed me how excited I am for my career in the next couple of years. I was able to spend time with a great group of Purdue students who became great friends, while learning about aviation. If I were given the chance to go back, I would do it in a heartbeat. I learned a great deal about the industry, and life lessons from the Pathfinders.

Tuesday, October 2, 2018

Ethical and Moral Dilemma of UAVs



               UAVs (unmanned aerial vehicles) have been around for much longer than the general population is aware. Though these older systems were not perfected through advanced technology, they were UAVs just the same. Mostly used for surveillance and taking videos and photos, these UAVs were highly experimental and generally did not last long. Manned planes were the dominating force because radio technology limited the reliability of UAVs. The culture back then was very resistant to UAVs, describing them as toys. Another early use of UAVs was target practice for manned pilots. Manned aircraft were also much easier ethically: bombing runs, precision strikes, and air support were all planned, structured, and organized, from the routes all the way to the pilot releasing the bombs. Once UAVs were armed, more specifically the Predator, many concerns were highlighted. Who should give the call to launch a missile? Is it different from a manned plane? Is a UAV with a warhead legal under international treaties?
               Today, one of the most controversial topics surrounding the military is the use of drones to perform precision strikes on high-value targets. Concerns range from misinformation about a target to civilian casualties to the effect on the pilot. In the book Predator, the original armed Predator, number 3034, was the first drone ever to perform a deadly strike. Following this milestone for military drones came a laundry list of legal and ethical questions that delayed progress. To begin, the culture behind UAVs was minuscule and unsupported; higher-ranking individuals considered them toys and very unreliable. Although they were unreliable at first, they could be much more than toys. Pilots did not care for the drones either, which made finding pilots even harder. Even once the technology started gaining traction and support, many people still did not believe in it.
               Once the Predator started to get serious support and use, people began seeing how useful the system was. One of its biggest uses was keeping an eye on potentially deadly targets, the most notable being Osama bin Laden. After watching him for a considerable time, the Predator support team realized how useful a "see and shoot" capability would be. The Predator's main job during this period was surveillance and buddy lasing for airstrikes, but if the team had the chance to strike, they wanted to be able to do so without waiting for an airstrike. This came from an incident in which the Predator crew, deciding whether to lase a target, kept waiting for confirmation when they overheard a nearby airport preparing to deploy a jet. They decided that if the enemy saw the drone, they would shoot it down and cause a fiasco for the U.S. government, so they backed the Predator off. Had they been able to bomb the target, believed to be Osama bin Laden, they could potentially have prevented 9/11.
               Once the drone was finally in the process of being armed, the effort was bogged down by bureaucratic hurdles. One treaty, signed by the U.S. and the former Soviet Union, barred the use of anti-ballistic missile systems. Lawyers and bureaucrats argued that putting a Hellfire missile on a Predator drone could constitute a system described in this treaty. The team could not do any flight testing or mount a missile on the system until they had the green light to do so. Eventually, they argued that the Predator does not carry a warhead in its nose but rather on a separate pod, and the project was a go. Even after the project was completed, the question of who gets to fire the missile was still up in the air; the chain of command for giving the order was completely unorganized. Even to this day, there is still debate about who should give the order to launch a precision strike.
               In my opinion, given good information and actionable intelligence, drone strikes are marvels of war. Being able to loiter over a target suspected of planning a terrorist attack, then strike once sure of their identity, can give the upper hand in the war against terrorists. But all of this can descend into tragedy. Civilian casualties are the biggest concern to me. It is best for the nation when the United States takes out a terrorist, but if you kill civilians at the same time, you may have just made more terrorists. Loss of innocent life is truly a tragedy that needs to be avoided at all costs. It is not only illegal and counterproductive to our cause, but it can destroy the lives of people who have no part in terror and only want to be left alone. Some people equate "pulling the trigger" on a drone strike to a video game: a flash of light on a screen as the missile launches at a white blob, then boom, the target is gone. Ensuring no civilian casualties should be the number-one concern when performing a precision strike on a high-value target.
               Another ethical concern is privacy. While I don't personally understand the big deal behind this one, many people do. Should the government be able to watch civilians with an unmanned aerial system such as a Predator drone? Satellites and spy planes, such as the legendary SR-71, have had the ability to spy on anyone since they were first designed; both are older technology, but the principle is the same, and drone technology is no different. Should a random civilian be able to fly his or her drone over your property? When is it considered stalking or spying? Can the property owner take counter-UAV actions? I think adults are (hopefully) mature and smart enough to tell the difference between someone stalking you and someone taking a nice picture of a sunset. If a drone is hovering over your house and taking pictures through your window, there's a good chance they aren't enjoying the sunset. So, in my opinion, if someone is flying that close to your property taking odd photos, yes, you should be able to take it down. Even that spurs many questions: What is too close? What legal means can you use to shoot it down? Who pays for damages in this scenario?
               Ethical and moral dilemmas surrounding the drone world will most likely never go away, but will instead evolve into different questions. Drones have the fantastic ability to perform missions never before possible with relatively cheap technology. Even with expensive military technology, drones can fly very dangerous missions in enemy airspace with no danger to human life, only to an expendable machine. Like every other machine and tool, they can help mankind with a plethora of issues and make life easier. On the opposite end, they also have the power to deal great damage. Responsibility, and teaching each other about these tools, can make a great impact.

Monday, October 1, 2018

Capstone Proposal

Below is the PowerPoint of my group's proposal. We decided to pick this topic because it involves areas of expertise that all of us have: Evan has the data processing and analysis experience, I have the experience and interest in sensor integration, and Thomas has the experience with flight paths. Below is the abstract for our project, followed by the PowerPoint.

Tuesday, September 25, 2018

C-Astral's Bramor Demonstration

     For a special treat, we had the honor of listening to a presentation by the CEO of C-Astral. He gave a very informative presentation on his experience leading up to C-Astral and the company's achievements. He talked mainly about what the C-Astral platform can do, and we were even able to see what was up and coming in the C-Astral fleet. The Bramor is a flying wing perfect for any consumer interested in flying commercially; further down this post you will find pictures of it. After the presentation we had the special opportunity to meet this aircraft. The test flight was to be done at Martell Forest. It was a simple mapping mission, mainly to show the class the pre-flight checklist and the professionalism behind larger commercial operations. Previous UAS classes had checklists, but for 3DR Solos and simple Slow Sticks, which contained a maximum of 10 steps or so. This bulky checklist had around 60 steps, quite a step up from the 3DR's.
     When we first arrived, the ground control station (GCS) was already set up, and we watched as Evan and Pete carefully performed the pre-flight checklist. It was more involved than previous ones we had done. The person reading the checklist aloud would listen for a certain verbal cue to move on; depending on the item being checked, the person physically checking would say "check," "clear," "rubbers on," and so on for different actions. Since a picture is worth a thousand words, the rest of this post will show what we as a class got to experience.

Upon arriving at Martell, the first thing we saw was C-Astral's Bramor. Roughly 4 feet in wingspan and less than half of that front to back, the flying wing had a good size to it. When I asked Marko (C-Astral's CEO) about the airframe design and material, he said it was made of Kevlar and reinforced fiberglass; looking into the airframe you could also see carbon fiber.

To the left, Dr. Hupy (far right, with the boonie hat) and Pete (far left, with the boonie hat) troubleshoot an issue with the Bramor while still working through the pre-flight checklist. Looking closely, just aft of the midsection of the flying wing you can see a red box-like shape. This is the parachute, the Bramor's only method of landing (safely).

After completing all pre-flight checklist items, the Bramor was ready to fly. The bungee-assisted catapult was drawn back and armed; with the press of a button, the Bramor would be launched into the air to begin its mapping mission.

This video shows the Bramor being launched. As you can see, the catapult is very powerful: in less than a second the aircraft is accelerated off the catapult and into flight.


The Bramor has a very interesting method of landing: a parachute. This recovery method is used in some applications, but rarely. The software can give users a good estimate of where the aircraft will land after parachute deployment, ensuring the aircraft does not come down in hazardous or less-than-desirable terrain.

While we were there, we were also able to watch a flight of a hexacopter. While not a C-Astral airframe, it was still very interesting. One very cool sensor this airframe carried was a thermal imaging sensor. There are no photos of it because the environment caused too much reflection, but seeing this sensor in action was fascinating. The hexacopter was flown to collect thermal imagery of the field. Below are a few photos of it.
The hexacopter with the "cockpit" open


Hexacopter without the arms, preflight




GCS for the whole operation of both Bramor and hexacopter

Wednesday, September 19, 2018

Mapping at Dr. Hupy's House

     Since mapping is one of the most basic requirements for this class, our lab was to do some test flights with a DJI Inspire. In teams of two, our goal was to come up with pre-flight and post-flight checklists for the Inspire. We also made a pre-flight checklist for the RedEdge camera.
     For the entirety of the three hours, we flew. My partner, Kyle Sheehan, was the first to fly, with me as his visual observer (VO). Our mission was to fly from the end of Dr. Hupy's street to the soybean field and take video of the entire journey. The mission took roughly 5-10 minutes and was more of an introduction to the DJI Inspire. To me, flying the Inspire was almost identical to flying the Mavic that I own; DJI uses the same user interface app for all of its commercial-level drones, and the interface and design are very user friendly. The only differences to me were the cost of each drone and the sensor package. Below are some photos that I took to document different stages of the flight.
Above is Evan Hockridge and Krysta Rolle performing their pre-flight checklist for the first flight of the day. Dr. Hupy is helping out for the first couple of flights to ensure no mistakes are made unintentionally. 
Take off! The first flight of the semester, the DJI Inspire took off flawlessly. The weather for this day was perfect, low winds, sunny, and almost no clouds in the sky.
Pilot in Command/Technician Kyle Sheehan performing a pre-flight check on the DJI Inspire. He decided to fly first, and I flew second, after which I became the Pilot in Command and technician. We both flew our missions without any mishaps. Our missions were identical: we both flew from the end of Dr. Hupy's road to a soybean field, taking video the entire way to eventually create a map from the data.

Below is the checklist Kyle and I used for both of our flights.

Checklist 
DJI Inspire Checklist 
  • Ensure Pilot in Command is in good mental condition (sufficient sleep, more than 10 hours, and no mind-altering substances) 
  • Ensure battery percentage level ≥ 95% 
  • Ensure controller is charged ≥ 95% 
  • SD card is inserted and has sufficient space for data 
  • Ensure firmware is up to date and installed on aircraft  
  • Clean camera lens of any debris or obstruction 
  • Check image on viewing device for any obstructions or damage to camera, or blurriness 
  • Check airspace restrictions 
  • Check Wx 
  • Precipitation less than 5% 
  • Winds under 10 knots 
  • Ensure visibility is enough and legal 
  • Ensure flight is in between civil twilight hours (see local time for specific times) 
  • Ensure flight area is unobstructed and safe for flight  
  • Pre-flight check on air frame 
  1. Camera locked in place 
  2. Motors clear of FOD 
  3. Check propellers for damage 
  4. Ensure propellers are locked into place 
  • Ensure take off area is unobstructed 

Red Edge Checklist 
  • Ensure RedEdge has an SD card inserted 
  • Connect GPS module 
  • Connect power cable 
  • Power on the camera with the On/Off button. 
  • The LED will remain off while the camera turns on 
  • Ensure camera is connected through Wi-Fi 
  • Make sure status information is as follows 
  1. # of satellites used  
  2. Altitude is set to 0 AGL 
  3. Check the signal strength of each satellite 
Post Flight Checklist 
  • Power off aircraft 
  • Power off Red Edge 
  • Remove SD card from payload 
  • Connect SD card to computer to ensure quality images 
  • If image quality is good: continue 
  • If image quality is poor, adjust accordingly and fly again
  • Power on drone and reconnect to controller 
  • Put drone into travel mode by flipping landing gear switch up and down 5 times 
  • Power down drone 
  • Remove battery 
  • Power down transmitter 
  • Remove props and place in case 
  • Disconnect and remove Red Edge sensor and place it in its appropriate housing 
  • Put drone and transmitter in their case and lock 
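Several of the checklist items above are numeric go/no-go limits (battery and controller at 95% or above, precipitation chance under 5%, winds under 10 knots). As a minimal sketch of how those limits could be encoded in software, here is a hypothetical Python go/no-go helper; the names `PreflightReadings` and `go_no_go` are my own invention for illustration, not part of any DJI or MicaSense tool.

```python
# Hypothetical sketch: encoding the checklist's numeric go/no-go thresholds.
# Thresholds come straight from the checklist above; all names are illustrative.
from dataclasses import dataclass

@dataclass
class PreflightReadings:
    battery_pct: float        # aircraft battery charge, percent
    controller_pct: float     # controller charge, percent
    precip_chance_pct: float  # forecast precipitation chance, percent
    wind_kts: float           # surface wind, knots

def go_no_go(r: PreflightReadings) -> list[str]:
    """Return a list of failed checks; an empty list means GO."""
    failures = []
    if r.battery_pct < 95:
        failures.append("battery below 95%")
    if r.controller_pct < 95:
        failures.append("controller below 95%")
    if r.precip_chance_pct >= 5:
        failures.append("precipitation chance 5% or greater")
    if r.wind_kts >= 10:
        failures.append("winds 10 knots or greater")
    return failures
```

A crew could run this after gathering readings and only proceed if the returned list is empty; anything in the list names exactly which checklist limit was busted.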
