Monday, March 30, 2015

Data Collection II

Introduction:


In this assignment, we collected microclimate data across the UW-Eau Claire campus using methodology similar to last week's exercise. This involved deciding on a standardized geodatabase and feature class, deploying it to each Trimble Juno GPS unit, collecting microclimate data in pairs using a Kestrel weather meter, checking in each group's data, and merging all data into one cohesive microclimate dataset. With Professor Joe Hupy gone for the day, we were required to work together to carry out these procedures properly, with special help from students Zach Hilgendorf, Aaron Schroeder, and Michale Bomber, who distributed the GPS and Kestrel units, deployed data to them, and checked in and merged the data after collection.
Trimble Juno GPS unit. This unit uses ESRI's ArcPad application, which allows for GPS collection into a geodatabase.
Kestrel weather meter, which can be used to read temperature, wind speed, wind chill, dew point, percent humidity and a number of other climate figures.


Methods:


Before heading into the field, it was important that everyone deploy the data properly to their Trimble GPS units to ensure standardized data collection. This involved the same methodology as the previous exercise; refer to the previous blog post for more information on the process. Armed with GPS units loaded with properly deployed geodatabases, and with Kestrel weather meters in hand, we were ready for the next step.

We divided our UW-Eau Claire campus area of interest into seven sections, one for each group of two students. My partner Nick Bartelt and I were assigned the northernmost section, the one labeled '8' above. This area includes the campus footbridge and the area around Haas Fine Arts Center.

Recall that our microclimate data collection included taking Kestrel readings for temperature at the surface and at two meters, wind speed, wind chill, dew point, and humidity. It is worth noting that even though we included a field for wind direction, we excluded it from data collection because we didn't have a tool to take the reading quickly and accurately. For further information on these fields, refer to the previous two blog posts. In the field, Nick began by taking the Kestrel readings while I recorded them into the Trimble GPS unit. We started along the footbridge and continued around the trails near Haas Fine Arts Center, switching roles halfway through, and ended up collecting over 60 GPS points. Thanks to our previous test run, we had very few hitches in data collection.

After collecting as many points as possible in our allotted time frame, we returned to the classroom to check in our data. The classmates mentioned above helped with this process and completed it by merging each group's data into one point feature class. For more information, refer again to the previous blog post.


Results:


I chose to process the GPS points using the IDW (Inverse Distance Weighted) spatial interpolation method. This method estimates each cell value as a weighted average of nearby sample points; points closer to the cell being estimated have more influence on the average. For more information on this interpolation method, see ArcHelp. I overlaid the interpolated surface on the basemap for reference.
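The weighted average at the heart of IDW can be sketched in a few lines of plain Python. This is a simplified illustration with made-up sample values; the ArcGIS tool adds search radii, barriers, and raster output on top of the same idea.

```python
import math

def idw(points, x, y, power=2):
    """Inverse Distance Weighted estimate at (x, y).
    `points` is a list of (px, py, value) samples."""
    num = den = 0.0
    for px, py, value in points:
        d = math.hypot(x - px, y - py)
        if d == 0:
            return value          # exact hit on a sample point
        w = 1.0 / d ** power      # closer samples weigh more
        num += w * value
        den += w
    return num / den

# three hypothetical temperature samples: (x, y, deg F)
samples = [(0, 0, 50.0), (10, 0, 60.0), (0, 10, 55.0)]
estimate = idw(samples, 1, 1)     # dominated by the nearby 50 F sample
```

Raising `power` makes nearby samples dominate even more, which is why a single bad reading can distort a whole neighborhood of the interpolated surface.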


This image shows the different zones used, along with a green point for each reading taken by the class. The points covered a wide array of terrain, from the middle of the footbridge to off-trail points in the woods, and spanned a large portion of the UW-Eau Claire campus in order to provide some variability in the following datasets.

This image shows temperature variations from readings taken at 2 m height across the UWEC campus

This image shows temperature variations from readings taken at surface level across the UWEC campus

This image shows wind chill trends

This image shows the changes in dew point across the UWEC campus

This image shows variations in wind speed throughout the UWEC campus

This image shows changes in humidity throughout the UWEC campus

Discussion:


When analyzing the above datasets, it becomes apparent that there are certain errors in the data. With different students taking readings across campus, it is inevitable that readings will vary slightly. Sometimes the Kestrel meters need a chance to cool down or warm up, which can influence the readings. Also, if a gust of wind comes at just the right time, the Kestrel will calculate a much lower wind chill. The main idea is that our dataset was affected by temporal variation, which was not accounted for in the study. This applies even within each reading: ideally, one point on the map would capture microclimate conditions at a single instant, when in reality each point took several seconds to record. During those seconds, conditions were prone to change, introducing unwanted variation into our data. Perhaps even more importantly, the weather itself was changing over the course of our data collection. Referring to the temperature maps, when we started at the base of the footbridge we were getting temperature readings at the top of our domain (60 degrees F), while by the end we were down into the 50s and even 40s. The sun also went away during the sampling.

The temperature interpolated surfaces came back pretty logically, with high readings on lower campus in sunny areas, and colder temperatures back in the shady wooded areas.

The wind chill map has some strange results. There is a severe outlier in the quadrant labeled 2, on upper campus behind Governors Hall; it is hard to miss. Either there was an input error here or someone's Kestrel read an outlandishly low number, but the result is that the entire map is less functional. The IDW tool splits the resulting surface into nine classes, so an outlier like this decreases the interpretability of the map: the majority of the colors shown in the legend appear only around the erroneous point, while just two or three are used throughout the rest of the map to represent actual wind chill values.
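One way to catch an outlier like this before it reaches the interpolation step is to recompute wind chill from the temperature and wind speed recorded at the same point. Here is a sketch using the standard NWS wind chill formula; note this is an assumption on my part, as the Kestrel's internal calculation may differ slightly, and the `tol` threshold is arbitrary.

```python
def wind_chill_f(temp_f, wind_mph):
    """NWS wind chill index; defined for temp <= 50 F and wind > 3 mph."""
    if temp_f > 50 or wind_mph <= 3:
        return temp_f  # outside the formula's domain, report air temperature
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

def chill_outlier(recorded_chill, temp_f, wind_mph, tol=10.0):
    """Flag a recorded chill that deviates wildly from the computed one."""
    return abs(recorded_chill - wind_chill_f(temp_f, wind_mph)) > tol
```

A point recorded as a -30 F chill alongside a 30 F air temperature and 10 mph wind would be flagged, since the formula gives roughly 21 F for those conditions.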

Dew point also has a couple of outliers, but they don't appear to affect the overall interpolation, probably because of the availability of sampled points in the neighborhood of the erroneous points. Basically, the IDW tool doesn't need to interpolate as much around these points as it did in the wind chill map.
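Because dew point is derived from temperature and humidity, outliers here can also be cross-checked against the other fields recorded at the same point. A sketch using the Magnus approximation follows; again, this is an assumed formula, not necessarily the Kestrel's internal method.

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Magnus approximation: dew point (C) from air temperature (C)
    and relative humidity (%)."""
    a, b = 17.27, 237.7
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_percent / 100.0)
    return (b * gamma) / (a - gamma)
```

At 100% humidity the dew point equals the air temperature, and it falls as the air dries out, so a recorded dew point above the recorded temperature is a sure sign of an input error.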

Wind speed yields a logical map, with higher values on the footbridge and along the shore of the Chippewa River. There are a number of other scattered high values as well, but since wind is not constant, these are most likely attributable to gusts. Also, upper campus seems more consistently windy, as expected.

The percent humidity map shows that the sampled wooded areas usually had high humidity readings. This is plausible because of moisture held by the trees, along with shaded areas maintaining snow cover on the ground.


Conclusion:


This exercise was a good introduction to doing field work in a team setting. The fact that Professor Hupy was absent added an interesting dynamic, as students were required to work together and troubleshoot. When collecting data, it is important that everyone shares the same goal and works in a standardized fashion; everyone needed to know the plan before heading out and how to compile the data afterwards. This exercise also introduced us to collecting microclimate data, which can be used in many areas of interest to highlight climate variations relative to surrounding areas, and it demonstrated these variations for our UWEC campus based on its various physical features.

This exercise, coupled with the previous ones, ultimately included creating a geodatabase with proper domains, subtypes, feature classes, and fields; deploying it; collecting data; checking it back in; and analyzing the results. I feel comfortable with this process after having done it, and I am sure familiarity with it will be very useful for future field work.

Sunday, March 8, 2015

Data Collection I

Introduction:


In the previous assignment, we created a geodatabase to suit our microclimate sampling exercise. The purpose of this week's lab was to test the use of GPS with our previously created databases and work out any kinks that we might run into before next week, when we'll be collecting the microclimate data. We familiarized ourselves with the process of readying a geodatabase for use in ArcPad, deploying it to our GPS units, collecting data, and checking the data back in. We used Trimble Juno GPS units, inputting data from a Kestrel weather meter.

Trimble Juno GPS unit. This unit uses ESRI's ArcPad application, which allows for GPS collection into a geodatabase; in this case, the one from last week's exercise.
Kestrel weather meter, which can be used to read temperature, wind speed, wind chill, dew point, percent humidity and a number of other climate figures.


Methods:


The majority of this lab was done indoors, readying the geodatabase for use in the field. As described in the previous exercise, this step is especially important because it reduces unnecessary work while in the field.

First, I connected the Trimble unit to the computer and readied my geodatabase for deployment. This involved opening ESRI ArcGIS, enabling the 'ArcPad Data Manager' toolbar, adding a basemap, and including the microclimate point feature class described in the previous exercise. For the basemap, I used a 2013 aerial image of the area, zoomed in to the UW-Eau Claire campus. Upon "getting data for ArcPad," only the extent of the area I was zoomed to would be included. In this step I also checked out the microclimate feature class, ultimately creating a package deployable to ArcPad. I copied this package (a file visible in Windows File Explorer) into my student folder as a backup in case the deployment didn't occur properly, and also onto the Trimble unit's memory card, making it available for use in the field.

The Get Data for ArcPad tool. This tool essentially takes the feature class and background image shown, checking them out for editing, and creating a package that is compatible with ArcPad. This package is then copied onto the GPS unit, and it will allow for digitization in the microclimate feature class. Play the video below for further information on data deployment. 


We were to go into the field in groups of two so that we could assist each other in collecting points, but as soon as I got outside I realized I had an issue. My microclimate feature class had no projection defined, so the GPS functionality had no spatial reference for my points. This meant that I couldn't digitize, so I had to go back inside to reassess. I used the Define Projection tool in ArcToolbox on my microclimate feature class, defining it as WGS 1984. I figured this GCS would be compatible with the Trimble units, because they take points in a GCS as well. By the time I had fixed my feature class and gone through the above process again, most of my classmates had already finished taking their data. This meant I had to go out alone and take my own Kestrel temperature, wind speed, wind chill, dew point, and percent humidity readings, which I recorded in my ArcPad session for only three points on the south side of campus.

The final step was to check our data back in, using a tool similar to the one used to check it out earlier in the exercise. This part worked smoothly, and my points were added back into an ArcMap session.

Discussion:


There were quite a few hitches that the class encountered in this lab, and it is good that we hit them now, as opposed to during our real data collection in the next exercise. I learned the valuable lesson that for features to be edited in ArcPad, they must have a projection defined. I also noticed that my GROUND_COVER field was actually called NOTES, which is an issue because I have an actual NOTES field as well. This will need to be resolved before next week's data collection exercise. At the end of this week's exercise, we voted on one student's geodatabase to be used by the rest of the class for further microclimate surveying on campus. This means all students will be using the same geodatabase with the same domains, basemaps, and feature classes, which will minimize discrepancies in the final dataset.


Conclusion:


This exercise was valuable because it allowed us to work out common issues that can occur when using GPS units to digitize data. Knowing how to deal with these issues is a very important skill to have when doing geospatial field work. Also, the exercise included valuable information on how to operate a Kestrel weather meter, and refreshed my memory on operating a GPS unit to digitize point features. Proper deployment of data is also very important, and the class experienced some of the issues associated with it. 

Sunday, March 1, 2015

Geodatabases, Attributes, and Domains

Introduction:


One of the most important data models in modern GIS is the geodatabase. It is the main mechanism used to store, organize, and work with geographic data in ESRI ArcGIS. The geodatabase is object oriented, meaning that features and attributes are stored together in a single unit, or object. This is preferable to the georelational model, in which features and attributes are stored separately but linked by a common feature ID. The georelational model is the traditional one, and its use has decreased since the introduction of ESRI's geodatabase.

There are a number of ways that a geodatabase can be set up to reduce overhead in collecting data. Especially important are domains, which restrict the way data can be entered into a feature class. A domain lets the geodatabase designer specify a range that an attribute must fall within, or allow a selection from pre-specified values. Imagine being out in the field to digitize locations of trees. For each tree point you need to record some attributes, like nearby ground cover and tree height. In any study, minimizing field time is important, and a domain helps achieve this. If you accidentally entered 100 meters instead of 10 meters for a tree's height, the domain would flag the value as falling outside the range you specified, say 5-25 m. You could recognize the input error and correct it before the analysis stage, where the erroneous value would negatively affect your data. A domain could also be used for ground cover, providing a list of options to select from rather than forcing the user to manually enter the same values over and over (e.g. grass, dirt).
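The two domain types in the tree example can be illustrated in plain Python. This is purely conceptual; in practice the rules live inside the geodatabase itself and are enforced at data-entry time.

```python
# Range domain: tree height must fall within 5-25 m
TREE_HEIGHT_M = (5.0, 25.0)

def valid_height(value):
    """True if a height reading falls inside the range domain."""
    lo, hi = TREE_HEIGHT_M
    return lo <= value <= hi

# Coded-value domain: pick ground cover from a fixed list
GROUND_COVER = {"grass", "dirt"}

ok = valid_height(10)      # a plausible tree height passes
typo = valid_height(100)   # the 100 m typo is caught
```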

In this exercise, we were tasked with the construction of a geodatabase, development of domains, and creation of a feature class for use in our microclimate survey that will be conducted later this semester. For the survey, we will be collecting many different pieces of data so it is important to consider how we want to store them. Generally, it is a good idea to use fewer feature classes with multiple attributes, which results in a clean, logical and queryable dataset usable for further analysis.

The data pieces that we will be collecting include: wind speed, wind direction, humidity, dew point, surface temperature, temperature at 2m elevation, wind chill, and notes. As I just mentioned, it is often advisable to aggregate data pieces into a single feature class, so that is what I will do. This will result in a single microclimate feature class, containing attributes that detail each of these different measurements.

Another important consideration when designing a geodatabase is the desired data type for the fields that will be used. In this example, the two general data types that will be used are text and numbers. However, there are a number of different number types that can be used, each with its own pros and cons.

This table shows number data types, detailing their storable range, application, and storage size.
For this particular study, I decided that floats would be the best option for numerical data storage, because I wanted to include decimal values but didn't need the extra range and precision that doubles provide.

The next step is to decide what kind of domains will be needed to facilitate field collection. I will now outline the conceptual basis for each domain, before providing a detailed tutorial on the creation of the geodatabase, development of domains, and creation of feature class.

Wind Speed: The wind can't go lower than 0, and it will almost certainly not exceed 50mph. As mentioned earlier, this will be stored as a float.
Wind Direction: This will be an azimuthal measurement taken with North as 0 degrees, up to 360 degrees. Also will be a float.
Humidity: Humidity is recorded as a percentage from 0 to 100%, which represents saturation.
Surface Temperature: Temperature normally doesn't fall below -40 degrees Fahrenheit even at this time of year, and certainly won't exceed 60.
Temperature at 2 Meters: Will use same domain values as above.
Dew Point: Since dew point is based on temperature, it makes sense to use the same domain range values as the temperature fields will be using.
Wind Chill: Won't drop lower than -40 either, so same domain can be used.
Ground Cover: Possible values include grass, snow, concrete, blacktop, gravel, water, or sand. It is also advisable to include an "other" option, in case an unforeseen value arises.
Notes: No domain will be used, but this field will be valuable in attaching any other important information to each micro-climate point.

Methods:

Begin by opening ArcCatalog and navigating to the folder that you would like to store your geodatabase in.

Select the folder, right click, and choose New -- File Geodatabase.
New File Geodatabase.gdb will be shown under the Contents tab; right click on it, choose Rename, and call it Microclimate.gdb.

Next, right click on your Microclimate.gdb and choose Properties.

Database Properties pop-up. Notice that I selected the Domains tab, as that is what we will be working on next.
The above window is where we will define the domains outlined in the previous section. Simply click the Domain Name box to create a new domain, and be sure to give it a relevant description.

Here is the populated Domains tab. Currently selected is the Ground Cover domain, which has a field type of Text, and a domain type of Coded Values. This basically means that fields with this domain will provide a selectable dropdown list of values, so they won't have to be typed repeatedly in the field. Click inside of the Code area to specify a relevant code, then click in Description to add the appropriate land cover description as shown above. Refer to the previous section for complete list of coded values for ground cover. 
The rest of the domains that will be added are numerical domains. They each use a floating point datatype, and just require a range of acceptable values to be specified.

The temperatures domain is selected. Note that the field type is a Float, and the minimum value is -40, and the maximum is 60. These represent the minimum and maximum acceptable values for temperature in Fahrenheit.
Add the remaining domains in this manner, specifying the minimum and maximum values illustrated in the introduction section. Remember to provide relevant descriptions to each domain.

Next, we will need to create a new microclimate feature class.

Right click on the Microclimate.gdb and select New -- Feature Class as shown above. 
A new pop-up window will appear. Name your feature class microclimate_[your username], and provide an alias if you'd like. Under Type, choose Point Features; this specifies that the feature class will contain point features. Click Next.

You will now be prompted about the coordinate system that will be used for XY coordinates in this dataset. For now, we will leave this blank. Choose Next. An XY Tolerance option will appear, accept the default value (should be 0.001 Unknown Units). Click Next. A Configuration Keyword option will now appear, accept Default and choose Next.

Now, you will see the window shown below.

This is the window that allows you to add fields. These will be similar to the domains we added earlier, but will more closely follow the outline from the end of the introduction section. Recall that we will have the following fields: wind speed, wind direction, humidity, dew point, surface temperature, temperature at 2m, wind chill, ground cover, and notes.
Here are the field names and data types. Use either all caps or all lowercase letters, and don't allow spaces, dashes, or special characters in the field names. Also, be sure to choose Float as the data type, because our domains will only appear if the data types match.
This area will be shown below the Field Name/Data Type window above. In the Domain area, be sure to select the corresponding domain for each field; for example, for WIND_SPEED, choose the Wind Speed domain from the drop-down. You won't see the Ground Cover domain as an option, because it is a text-based domain. On the GROUND_COVER field, choose Text as the data type so the Ground Cover domain becomes available.
Set up your remaining domains in this fashion. Remember that DEW_POINT, TEMPERATURE_SURFACE, TEMPERATURE_2M and WIND_CHILL will all be using the Temperatures domain. NOTES will use no domain at all, allowing us to input any notes, not restricted by values.

It is also advisable to import basemap data into your geodatabase for reference while in the field, and the process for doing this depends on your area of interest, and data availability.

Discussion:


The geodatabase, domains, and feature class just created will be helpful for our microclimate survey, reducing the busy-work that would otherwise be done in the field by taking multiple points for each attribute. They will allow for streamlined data collection: taking microclimate readings at each point and simply inputting the data into the proper fields on the GPS device. It is worth noting, however, that over-aggregation can also be problematic; if two data pieces have distinct functions, they should be put in separate feature classes.

Conclusion:


This exercise was valuable, because as I've demonstrated, proper planning for a field project and database design that accommodates its goals are very important in the geospatial field. It reduces overhead, allowing for quick, concise data collection and higher data reliability because of the elimination of many potentially erroneous values. It is also valuable to be able to walk a potential classmate or co-worker through the process of designing a geodatabase, which is what this exercise ultimately called for. 

Sources:

Geog 337, 336 Class Notes
ArcGIS Help. (2013, July 30). ArcGIS field data types. Retrieved March 1, 2015, from http://resources.arcgis.com/en/help/main/10.1/index.html#//003n0000001m000000

UAS (Unmanned Aerial System) Mission Planning

Introduction:


Unmanned aerial systems (UAS) are an increasingly prevalent technology used to accomplish a wide range of tasks. UAS are often associated with military operations, and the word "drone" is often used incorrectly to describe just one aspect of a UAS. As the name implies, a UAS consists of much more than just an unmanned vehicle: it includes a platform (the flying device), a sensor, a ground station, radio control, and sometimes autopilot hardware. A number of other parts can be included depending on its use.

Not only does the word "drone" have a negative connotation, it also inaccurately describes UAS, implying that they are unmanned, unpiloted robots operating free of human input. This is false; a UAS should always be piloted, whether computationally or manually. Best practice is a three-person team: a Pilot in Command (PIC), who mans the physical controls of the unmanned aerial vehicle (UAV); a Pilot at Controls (PAC), who operates sensors and monitors from a computer; and an engineer/spotter, who keeps close watch on the UAV, providing assistance to the PAC and PIC as needed.

UAS are often associated with military utility, but it is important to recognize their usefulness in many different situations. In geography, they can assist research by providing aerial imagery, atmospheric monitoring, and many other kinds of remotely sensed data. A UAS can be a good alternative to other remote sensing platforms because it provides high spatial resolution, rapid results, and potentially higher temporal resolution at a reasonable price.

In this exercise, I looked at some unmanned aerial vehicles and used flight simulator software to get a better understanding of the different types of UAVs, before acting as a consultant and recommending a UAS for assistance in various types of research.

Methods:


The first component of this exercise was to use a flight simulator to log flight time on a number of different types of UAV, getting to know the strengths and weaknesses of each and beginning to understand situations in which they could be used. To do this, we used RealFlight 7.65 with real UAS remote controllers to try different platforms, utilizing a number of different flight views.

The first platform that I demoed was the Hexacopter 780. This is a multi-rotor platform with six propellers that is quite stable and capable of slow flight speeds, which would allow for high spatial resolution aerial imagery and other remotely sensed data. However, multi-rotor platforms can't manage a very heavy payload, and because they have so many rotors, their battery life suffers.
I logged around 30 minutes of flight time on the hexacopter on an ocean map with a shipwreck and a number of obstacles. I kept the wind low, as this was my first simulation and the controls were hard enough as they were. I had five crashes in my 30 minutes of flight time, most involving failed landings or collisions with obstacles.
The next platform I demoed was a fixed wing UAV called the Slinger. Fixed wing platforms are often useful for managing heavy payloads over large study areas. One main detriment to fixed wing UAVs is that they can only slow down to a point (below which they no longer generate lift), and thus cannot achieve as detailed remotely sensed imagery as multi-rotor systems.
This is the Slinger, a common fixed wing platform. The controls were very different from the hexacopter's, with much higher speed and touchy turns. I demoed this UAV on the Sierra Nevada map, experimenting with heavy winds. It was affected quite significantly by the wind, which would make sampling impossible in adverse conditions. The chase view was the most fun of the camera views, but I used the fixed view as well, which required some critical thinking about the relative controls. I had six crashes in the ~30 minutes I tested this platform, most of which were due to my overconfidence, testing maneuvers that would not be acceptable in the field.
Next, I tested a Quadcopter X platform. This is a multi-rotor platform with just four rotors. The fewer rotors a UAV has, the less stable it is, especially in adverse conditions. However, it can have a longer battery life than multi-rotor platforms with more rotors.
I flew the quadcopter on the junkyard map for about 30 minutes, experimenting with different views, including the first-person view, which gives you the perspective of being on the nose of the craft. This technology is widely used in real UAS scenarios. I had trouble with the fixed view, crashing four times within the first couple of minutes of attempting takeoff. I also found that successfully landing the quadcopter was very difficult and required a very light touch with the accelerator joystick. I had a whopping total of 12 crashes in my half hour of flight time.
Finally, I used the P-51 Mustang platform for testing. This is another fixed wing craft, but it is significantly larger than the Slinger and takes off from a runway on the ground. This model is also gas-powered, which allows for longer flights.
I tested the P-51 Mustang on the junkyard environment, using the fixed view, chase cam, and cockpit cam. With the Mustang I was able to achieve successful landings! I only had one or two successful landings with the other craft, but this one I was able to land repeatedly. The controls were less touchy than the Slinger's, and it flew faster. I only crashed four times in the half hour of flight time logged.

The final part of this exercise was to use our acquired knowledge to provide consultation on several scenarios, created by our professor, about possible uses of UAS.

Scenario 1: An atmospheric chemist is looking to place an ozone monitor, and other meteorological instruments onboard a UAS. She wants to put this over Lake Michigan, and would like to have this platform up as long as possible, and out several miles if she can.

Since she wants to cover a large area with a long flight time, this is a good scenario for a fixed wing craft. Though I only tested a couple of fixed wing craft, I believe a larger one with high payload capacity would be advisable to accommodate multiple meteorological instruments. With this in mind, a larger, gas-powered UAV would be in her best interest to increase flight time and distance. This would be more expensive, but necessary to accommodate her daunting task of covering a large chunk of Lake Michigan.

Scenario 6: An oil pipeline running through the Niger River delta is showing some signs of leaking. This is impacting both agriculture and loss of revenue to the company.

Depending on the magnitude of the oil leak at hand, a number of different UAS could be used. Since the scenario reads that the pipeline is showing signs of leaking, I will assume the company is unsure of the problem and its whereabouts. This implies that a large area must be covered to monitor the potential problem, which sounds best suited to a fixed wing aircraft. Its low-altitude flight capability, long flight times, and speed of data retrieval make it preferable to satellite imagery, and because of this scenario's large geographic extent, a fixed wing craft is better suited to the task than a single- or multi-rotor system.

Conclusion:


This exercise was a valuable introduction to the growing field of UAS technology. These systems facilitate data collection for many different tasks, many of which are directly applicable to geography and geospatial data. Knowing the basics of what these unmanned aerial systems consist of (and what they don't consist of) is important in knowing ways in which we can solve geographic problems. More generally, being able to communicate our knowledge to a potential client is an important skill for workers in the geospatial field, as many clients aren't familiar with the technology. It is important to be able to look at a problem and find a way to quantify it and collect data.

Sources:

http://www.iupac.org/publications/pac/special/0199/pdfs/engelhardt1.pdf