Thursday, August 10, 2017

Final Project

Last entry for GIS 4048! End of the semester. This entry is for my final project. Our assignment was a location decision, just like a module several weeks back. The big difference this time was that we had to find our own data. The learning objectives for this assignment were:

  • Organize and execute a research project in a systematic and timely manner
  • Outline the aspects of a client’s GIS needs and develop a practical project plan for addressing those needs
  • Design, compile, and implement (1) a spatial database with relevant data from various sources and (2) a set of analytical tools appropriate for the problem/application
  • Assess, select, and apply technologies that will contribute to project objectives
  • Demonstrate the ability to implement geographic analysis to solve a problem
  • Communicate the GIS project process and results in written and graphic media at a professional level
I liked doing this project. This was the first time I felt like I was truly doing GIS work, because I had full control over the project. I had to choose my subject and area of research, perform data collection and preparation, and conduct the analysis.

I chose to search for a suitable area for a client to purchase a house in Austin, Texas. Here is an image of my base map:



The client had 6 criteria that they wanted the search to include. I collected data on the area pertaining to those criteria. Most of the data was in Excel files, so I had to join the tables with shapefiles to be able to use the data in the analysis. Once I had the shapefiles, I could convert them to rasters.
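Here is a minimal sketch of that join-then-rasterize step; the file names, key field, and score field are hypothetical stand-ins for my actual data:

    import arcpy

    arcpy.env.workspace = r"C:\gis\final_project"  # hypothetical workspace

    # Convert one Excel sheet to a table, then join its attributes to the
    # tract shapefile on a shared key field.
    arcpy.ExcelToTable_conversion("criteria.xlsx", "criteria_tbl.dbf", "Sheet1")
    arcpy.JoinField_management("tracts.shp", "TRACT_ID",
                               "criteria_tbl.dbf", "TRACT_ID")

    # Rasterize the joined attribute so it can feed the weighted overlay.
    arcpy.FeatureToRaster_conversion("tracts.shp", "SCORE", "score_ras.tif", 30)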

With all of the rasters created, I made a weighted overlay model to run my analysis. The results were not what I expected, but that is why we do analysis: to find the truth.
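The overlay itself was built with the Weighted Overlay tool, but the same idea can be sketched in a few lines of map algebra; the layer names and weights below are illustrative, not my actual model:

    import arcpy
    from arcpy.sa import Raster

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\gis\final_project"  # hypothetical

    # Each input has already been reclassified to a common suitability
    # scale; the weights are illustrative and must sum to 1.0.
    suitability = (0.25 * Raster("schools_rc.tif") +
                   0.25 * Raster("crime_rc.tif") +
                   0.25 * Raster("price_rc.tif") +
                   0.25 * Raster("commute_rc.tif"))
    suitability.save("suitability.tif")

Here is an image of my final analysis: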


This has been my favorite class so far in my GIS studies. I can't wait for more!
Props to Penelope and Jeremy!!

Mod 10 and 11

This week I'm doubling up on blog posts. I got a little behind, but I'm back on track. What a wild semester! Let's dive in: Mod 10, Creating Custom Tools, and Mod 11, Sharing Tools.

Mod 10


Mod 10 covered the benefits of creating script tools, editing existing tool code, setting tool parameters, setting script tool parameters, and customizing messages to debug script tools.

The exercise was to create a basic script tool. I was given a multiclip script and needed to create a toolbox to run it. Once the toolbox was created, I imported the script.

With a toolbox and a functioning script created, I set parameters for the tool so that users could choose their own inputs and outputs. Here is a screenshot of the final tool interface that I created.


With the parameters set and the tool set up, I next needed to edit the script to run inside the tool. I used the arcpy.GetParameter() function to tell the script where to get each parameter. The tool ran a multiclip operation that clipped multiple shapefiles to the Mexican state of Durango, but it could be used with any datasets that need to be multiclipped.
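Here is a stripped-down sketch of how a script tool like this can read its parameters; I'm using arcpy.GetParameterAsText(), the string-returning sibling of arcpy.GetParameter(), and all the names below are placeholders rather than the lab's actual script:

    import os
    import arcpy

    # Script tool parameters, indexed in the order they appear on the
    # tool's parameter tab.
    inputs = arcpy.GetParameterAsText(0)   # multivalue: shapefiles to clip
    clip_fc = arcpy.GetParameterAsText(1)  # clip boundary, e.g. Durango
    out_ws = arcpy.GetParameterAsText(2)   # output workspace

    # A multivalue parameter arrives as one semicolon-delimited string.
    for fc in inputs.split(";"):
        out_name = arcpy.Describe(fc).baseName + "_clip.shp"
        arcpy.Clip_analysis(fc, clip_fc, os.path.join(out_ws, out_name))
        arcpy.AddMessage("Clipped " + fc)  # shows in the results window

Here is a screenshot of my results window showing a successful operation.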


Lastly, I paired the toolbox and script so that they could be shared, and put them in a zip file.


Mod 11

Mod 11 was a continuation of Mod 10. This week we focused on different methods for sharing script tools, the file structure for sharing them, identifying data and workspaces for script tools, creating a geoprocessing package, embedding and password-protecting scripts, and script tool documentation.

The assignment this week was very similar to Mod 10's. We learned about the sys.argv[] expression and how it differs from the arcpy.GetParameter() function.

I needed to take a script that created random points and placed a buffer around them, and essentially do everything that we did in Mod 10, except using the sys.argv[] expression to read the parameters.
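Here is a minimal sketch of the sys.argv[] approach with an assumed parameter order; unlike arcpy.GetParameter(), sys.argv[0] is the script path itself, so tool arguments start at index 1 and always arrive as plain strings:

    import sys
    import arcpy

    out_ws = sys.argv[1]      # workspace for the outputs
    num_points = sys.argv[2]  # number of random points (arrives as a string)
    buff_dist = sys.argv[3]   # buffer distance, e.g. "500 Meters"

    # Scatter the random points, then buffer them.
    arcpy.CreateRandomPoints_management(out_ws, "rand_pts.shp",
                                        number_of_points_or_field=num_points)
    arcpy.Buffer_analysis(out_ws + "/rand_pts.shp",
                          out_ws + "/rand_buff.shp", buff_dist)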

I edited the script to work with the parameters and, lastly, password-protected the toolbox so it could be shared without having to worry about unwanted eyes prying into the script. Here is a screenshot of the parameter input and the final map it created.



This was the conclusion of the class, and what can I say; I have a newfound respect for computer programmers. I first thought that Python was illogical and had no structure (put a colon here, indent there, the list goes on), but towards the end of the class I started to see the patterns and realized that everything has a reason. After I decompress from this class, I want to dive a little deeper into GIS programming. And what can I say about Dr. Morgan; if I were him, I would have told me to go get bent a long time ago. What a patient guy. Kudos to him.

Sunday, July 16, 2017

Mod 8: Location Decisions

This week's lab was done in two parts. The first part was local property assessment, and the second was location decisions. Here are the questions asked during the property assessment exercise:

1. Conduct a web search to locate a property appraiser’s office in your area.
Q1:  Does your property appraiser offer a web mapping site? If so, what is the web address? If not, what is the method by which you may obtain the data? I live in Pueblo County, Colorado, and the County Assessor does have a web mapping site. The address is www.county.pueblo.org.


2. Most property appraisers' websites offer a list of recent property sales by month. Search for the month of June of the current year and locate the highest-priced property sold.
Q2:  What was the selling price of this property? What was the previous selling price of this property (if applicable)? Take a screenshot of the description provided to include with this answer. My county hadn't updated its records yet, so I looked at the May records. The highest-priced property sold for $325,000. The previous selling price was $290,000 back in 2014.



3. The selling price and assessed price will differ in most cases (higher or lower). You may choose to search for a different property based on location/owner name or use the same result from the previous question to answer the following questions.
Q3:  What is the assessed land value? Based on land record data, is the assessed land value higher or lower than the last sale price? Include a screenshot. The screenshot above shows an assessed value of $23,330, far below the last selling price. I noticed that most of the houses sold have assessed values much lower than their actual selling prices.


Q4:  Share additional information about this piece of land that you find interesting. Many times, a link to the deed will be available, providing more insight into the sale. There was nothing too interesting about this house: 2,900 sq. ft., a hot tub, and a fireplace.

The second section of our lab was location decisions. The assignment was to locate an area for clients to buy a house. They had 4 criteria they wanted addressed in the search: an area with a high concentration of homeowners versus renters, a high concentration of neighbors aged 40-49, and proximity to their two places of work, a hospital and a university.

I created four maps that were then used to perform a weighted overlay. The first image I'm going to show contains all four maps: two use Euclidean Distance to show distance from the places of work, and the other two show concentration by manipulating the attribute table. All maps were then converted to rasters and reclassified for use in a weighted overlay model.
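In arcpy terms, the distance-and-reclassify step for one criterion looks roughly like this; the file names, cell size, and class breaks are hypothetical:

    import arcpy
    from arcpy.sa import EucDistance, Reclassify, RemapRange

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\gis\mod8"  # hypothetical

    # Straight-line distance from one workplace, then reclassified so
    # that closer cells receive a higher score (breaks illustrative).
    dist = EucDistance("hospital.shp", cell_size=100)
    ranked = Reclassify(dist, "VALUE", RemapRange([[0, 5000, 3],
                                                   [5000, 15000, 2],
                                                   [15000, 100000, 1]]))
    ranked.save("hospital_rank.tif")

Here is the first map: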






The final map shows two panels. The first shows the four search criteria equally weighted. The second shows the distance layers weighted at 15% each and the concentration layers weighted at 35% each.



Sunday, July 9, 2017

Mod 7: Homeland Security, Protect

     This week was a follow-up to last week's Preparing MEDS. I used the data created last week to generate a surveillance and contingency plan for the Boston Marathon. The exercise consisted of two maps: one of hospital assets around the finish line and checkpoints 500 feet out from the finish line, and a second of proposed sites for surveillance cameras around the finish line.

     The hospital map depicts the 10 closest hospitals to the finish line. This information was obtained by generating a near table; once we had the table, we sorted it by distance, took the 10 closest, and created a shapefile of those hospitals. The second part of this map was finding ingress and egress routes around the finish line. I created a 500-foot buffer around the finish line, used the intersect tool with the buffer and the local streets, and created a shapefile of the intersections.
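The geoprocessing behind this map boils down to three tool calls; this is a sketch with placeholder file names, not the lab's exact data:

    import arcpy

    # Distance from the finish line to every hospital; the 10 closest
    # are then selected from the resulting table.
    arcpy.GenerateNearTable_analysis("finish_line.shp", "hospitals.shp",
                                     "near_tbl.dbf", closest="ALL")

    # 500-foot perimeter, then the street crossings that become the
    # ingress/egress checkpoints.
    arcpy.Buffer_analysis("finish_line.shp", "finish_500ft.shp", "500 Feet")
    arcpy.Intersect_analysis(["finish_500ft.shp", "local_streets.shp"],
                             "checkpoints.shp", output_type="POINT")

Here is the final map: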



The second map depicts proposed sites for surveillance cameras to watch the finish line. This map is a little more interesting because I used LiDAR data to generate 3D maps. I used the XY tool to generate points for the cameras. From there, I could produce a viewshed map and then a line of sight for each camera in 2D and 3D.
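A sketch of the viewshed step, assuming a LiDAR-derived elevation surface and a camera point file with made-up names; the 2D and 3D line-of-sight views were produced with separate 3D Analyst tools:

    import arcpy
    from arcpy.sa import Viewshed

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\gis\mod7"  # hypothetical

    # Cells of the LiDAR-derived surface that are visible from the
    # proposed camera locations.
    vis = Viewshed("finish_line_dem.tif", "camera_points.shp")
    vis.save("camera_viewshed.tif")

Here is my second map: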



Sunday, July 2, 2017

Mod 6: Prepare MEDS

     This week's assignment was to prepare a MEDS (Minimum Essential Data Set) for the Boston metro area, to be used by the Department of Homeland Security. The purpose of the MEDS is to have a database ready to go in the event of a natural disaster or a security problem such as a terror attack. Having a database created and stored reduces the time needed to generate good-quality maps and plans in the event of an upset condition. Below is a screenshot of my database, along with an explanation of its contents and the processes used to create it.


     There are 7 parts to this MEDS: boundaries, transportation, orthoimagery, elevation, hydrology, geographic names, and land cover.
     First, I needed to set a boundary for my study area. I used Boston and the surrounding towns as a starting point to create a buffer of 10 miles; the outer edge of the buffer became the boundary of the study area. All layers in the database were clipped to this shapefile.
     I created a transportation file next. Because the given data was quite extensive, I needed to simplify it: by grouping similar road types together with search queries on the attribute table, I reduced the roads down to 3 types and created new feature classes for primary, secondary, and local roads. I changed the symbology on the new classes and concealed the layers at small scales to declutter the map.
     Next, I added the hydrography layer. There was no need to manipulate this data. It was already in good shape.
     I added a land cover file to the geodatabase. It was a large file, and to reduce its size I used the Extract by Mask tool to cut the raster extent down to the boundary shapefile. I changed the color scheme to the NLCD 2006 legend to be in line with other MEDS maps from around the country.
     Next, I added orthoimagery and an elevation DEM. Orthoimagery is just a fancy name for aerial images that are georeferenced to the same coordinates as the other layers of the map. The DEM is a raster image that shows elevations.
     Lastly, I added a geographic names file. I started with a .txt file and had to transform it into a shapefile; I used the Create Feature Class From XY Table tool to do this. Once the shapefile is created, all of the points have attributes and can be labeled within the layer properties.
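The scripted equivalent of that step would look something like this; the coordinate field names are the usual GNIS longitude/latitude columns, but treat them and the file names as assumptions:

    import arcpy

    # Build an XY event layer from the coordinate columns of the names
    # text file, then persist it as a shapefile.
    arcpy.MakeXYEventLayer_management("geonames.txt", "PRIM_LONG_DEC",
                                      "PRIM_LAT_DEC", "names_lyr",
                                      arcpy.SpatialReference(4269))  # NAD83
    arcpy.CopyFeatures_management("names_lyr", "geographic_names.shp")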

Wednesday, June 28, 2017

Mod 6: Geoprocessing with Python

This week's exercise was geoprocessing with Python. Our assignment was to write a script that uses 3 tools: AddXY, Buffer, and Dissolve. The AddXY tool wasn't covered in lecture, so this was a good opportunity to go back through the help section in ArcMap to see how to write the script. The other two tools were covered, so writing the script was straightforward. I liked this assignment because I could open up ArcMap and see my results after running the script.
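The script boiled down to three tool calls; this is a minimal reconstruction with placeholder paths and an assumed buffer distance, not my submitted script:

    import arcpy

    arcpy.env.workspace = r"C:\gis\mod6_python"  # hypothetical
    arcpy.env.overwriteOutput = True

    # 1. Add POINT_X / POINT_Y coordinate fields to the points (in place).
    arcpy.AddXY_management("hospitals.shp")

    # 2. Buffer the points.
    arcpy.Buffer_analysis("hospitals.shp", "hospitals_buff.shp", "1000 Meters")

    # 3. Dissolve the overlapping buffers into a single feature.
    arcpy.Dissolve_management("hospitals_buff.shp", "hospitals_diss.shp")

Here are two screenshots: the first shows the results in the interactive window, and the second shows the shapefiles in ArcMap.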



Friday, June 23, 2017

Mod 5: DC Crime Mapping

This week we started the Homeland Security section of the course. I was required to turn in two maps of Washington DC. The first was a proportional symbol map showing DC police stations in proximity to crime occurrences. I first had to use an xls file to create a shapefile mapping the locations of the DC stations. Then I could create a multi-ring buffer and count the occurrences as they extended out from the police stations. I performed a spatial join with the multi-ring buffer, and from the spatial join I could calculate the ratio of total crimes to crimes in proximity to police stations.
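The buffer-and-join sequence looks roughly like this in arcpy; the ring distances and file names are placeholders:

    import arcpy

    arcpy.env.workspace = r"C:\gis\mod5"  # hypothetical

    # Rings at increasing distances around each police station.
    arcpy.MultipleRingBuffer_analysis("dc_police.shp", "police_rings.shp",
                                      [0.5, 1, 2], "Miles",
                                      Dissolve_Option="ALL")

    # Tally crimes per ring: the spatial join's Join_Count field records
    # how many crime points fall inside each buffer polygon.
    arcpy.SpatialJoin_analysis("police_rings.shp", "crimes.shp",
                               "rings_crimes.shp", "JOIN_ONE_TO_ONE")

Here is the final map: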


The second map was a kernel density map showing the density of individual crimes that occurred within Washington DC. I used the Kernel Density tool for three crime types: burglary, homicide, and sex abuse. I used a cell size of 73 and a search radius of 1500.
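The density surfaces can be scripted in a few lines; the point file names below are assumptions, but the cell size and search radius are the ones from the lab:

    import arcpy
    from arcpy.sa import KernelDensity

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\gis\mod5"  # hypothetical

    # One density surface per crime type, using the lab's cell size (73)
    # and search radius (1500).
    for crime in ["burglary", "homicide", "sex_abuse"]:
        dens = KernelDensity(crime + "_pts.shp", "NONE",
                             cell_size=73, search_radius=1500)
        dens.save(crime + "_density.tif")

Here is my second map: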