Thursday, December 11, 2008
Sunday, November 30, 2008
Saturday, November 29, 2008
This is all in a generic handler (*.ashx). The Imports:
And the Class:
Public Class geo_content : Implements IHttpHandler

    Public Sub ProcessRequest(ByVal context As HttpContext) Implements IHttpHandler.ProcessRequest
        context.Response.ContentType = "text/xml"
        context.Response.ContentEncoding = System.Text.Encoding.UTF8

        Dim xmlWriter As New XmlTextWriter(context.Response.Output)
        xmlWriter.WriteStartDocument()
        ' root element name is illustrative; the namespace declaration is the point
        xmlWriter.WriteStartElement("geodata")
        'xmlWriter.WriteAttributeString("xmlns", "rdf", Nothing, "http://www.w3.org/1999/02/22-rdf-syntax-ns#")
        xmlWriter.WriteAttributeString("xmlns", "geo", Nothing, "http://www.w3.org/2003/01/geo/wgs84_pos#")
        xmlWriter.WriteElementString("description", "latitude and longitude from sql server 2008")

        Dim con As SqlConnection = Create_Connection()
        Try
            If con.State = ConnectionState.Closed Then con.Open()
            Dim geoReader As SqlDataReader = Create_Geo_reader(con)
            If Not IsNothing(geoReader) Then
                If geoReader.HasRows Then
                    Do While geoReader.Read
                        Dim geom As SqlGeometry = CType(geoReader("Geom_Data"), SqlGeometry)
                        Dim lat As Double = CType(geom.STY, Double)
                        Dim lng As Double = CType(geom.STX, Double)
                        Dim id As Integer = CType(geoReader("GeoID"), Integer)
                        Dim name As String = CType(geoReader("Name"), String)
                        ' write one item per row (element names are illustrative)
                        xmlWriter.WriteStartElement("item")
                        xmlWriter.WriteElementString("id", id.ToString)
                        xmlWriter.WriteElementString("name", name)
                        xmlWriter.WriteElementString("lat", lat.ToString)
                        xmlWriter.WriteElementString("lng", lng.ToString)
                        xmlWriter.WriteEndElement() 'close item element
                    Loop
                End If
                geoReader.Close()
            End If
        Catch ex As Exception
            ' nothing useful to return to the client here; log if needed
        Finally
            If con.State = ConnectionState.Open Then con.Close()
        End Try

        xmlWriter.WriteEndElement()
        xmlWriter.WriteEndDocument()
        xmlWriter.Close()
    End Sub

    Public ReadOnly Property IsReusable() As Boolean Implements IHttpHandler.IsReusable
        Get
            Return False
        End Get
    End Property
    Private Function Create_Connection() As SqlConnection
        Try
            Dim Connection As New SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings("baseBSWstr").ConnectionString)
            Return Connection
        Catch ex As Exception
            Return Nothing
        End Try
    End Function

    Private Function Create_Geo_reader(ByVal Connection As SqlConnection) As SqlDataReader
        Dim programReader As SqlDataReader
        Dim sqlStatement As String = "SELECT * FROM geograph_data WHERE [GeoID] <> @negGeoID ORDER BY [Name]"
        Dim command As New SqlCommand
        Try
            If Connection.State = ConnectionState.Closed Then Connection.Open()
            With command
                .CommandText = sqlStatement
                .CommandType = CommandType.Text
                .Connection = Connection
                .Parameters.AddWithValue("@negGeoID", -1) ' parameter value is illustrative
            End With
            programReader = command.ExecuteReader
            Return programReader
        Catch ex As Exception
            Return Nothing
        End Try
    End Function

End Class
This starts by creating an XmlTextWriter that is set to the context output stream. I add the XML namespace for geo. There is a separate function to create the connection, and another to create the SqlDataReader. In the reader function you see a simple SELECT statement with a parameter for the WHERE value. The reader is returned and the code cycles through each row. A variable called geom houses the geometry, and you can see it is of type SqlGeometry. Using this type gives access to the STY and STX properties. Again, not the most elegant bit of code, but it works.
Thursday, November 27, 2008
Here is the code that I used. It is server side code, with the connection string stored in the web.config file.
Dim xDoc As New XmlDocument
xDoc.Load("markers.xml") ' file name is illustrative
Dim cNodes As XmlNodeList
cNodes = xDoc.GetElementsByTagName("marker")
Dim i As Integer = 0
Dim con As Data.SqlClient.SqlConnection
con = New Data.SqlClient.SqlConnection(ConfigurationManager.ConnectionStrings _
    ("baseBSWstr").ConnectionString) ' connection string name assumed; it lives in web.config
con.Open()
For Each XNode As XmlNode In cNodes
    ' note: this relies on the attribute order being lat, lng, name
    Dim ac As XmlAttributeCollection = XNode.Attributes
    Dim lat As Double = CDbl(ac.Item(0).InnerText)
    Dim lng As Double = CDbl(ac.Item(1).InnerText)
    Dim name As String = ac.Item(2).InnerText
    ' WKT points are (x y), i.e. (lng lat) - my first attempt had them reversed
    Dim cmd As New SqlCommand("INSERT INTO geograph_data " & _
        "VALUES (" & i & ", geometry::STGeomFromText('POINT(" & _
        lng & " " & lat & ")', 0), '" & name & "')", con)
    cmd.ExecuteNonQuery()
    i += 1
Next
con.Close()
This was just run on my hard drive against a local instance, so I wasn't too concerned about security. From what I understand, using parameters is more secure. Here is an example.
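Since the code above splices values straight into the SQL string, here is a minimal sketch of the parameterized alternative. It uses Python's sqlite3 rather than the VB.NET SqlCommand from the post, and the table and column names just mirror the example above:

```python
import sqlite3

# In-memory database standing in for the real SQL Server table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE geograph_data (GeoID INTEGER, Name TEXT)")

# String splicing breaks on names like O'Halloran Hill and invites
# SQL injection. With placeholders, the driver handles quoting and
# escaping for you.
rows = [(1, "Alice Springs"), (2, "O'Halloran Hill")]
con.executemany("INSERT INTO geograph_data (GeoID, Name) VALUES (?, ?)", rows)

names = [r[0] for r in con.execute("SELECT Name FROM geograph_data ORDER BY GeoID")]
print(names)  # the apostrophe went in safely
```

In the VB.NET code the same idea is a placeholder like @name in the statement plus SqlCommand.Parameters.AddWithValue.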
Now if you want to play around with sql server 2008 with just management studio here are some tutorials.
JasonFollas - this describes the difference between geometry and geography...pretty good series.
Tuesday, November 25, 2008
Unfortunately the book budget is a little low right now, as I've made a number of purchases recently, so I don't own a copy. Speaking of which, I also recommend Bivand et al.'s spatial statistics book for R. It fills a void in spatial statistics books that has been growing. Most spatial stats books focus heavily on theory, which is fine, but for someone like me who is not in a course there is a lack of worked examples. Waller and Gotway's book is theoretical and has exercises, but no answer key, so who knows if I produced the "correct" results. Andy Mitchell's book is a great start, but doesn't go into much depth (it doesn't say much about first- and second-order effects, inhomogeneous K-functions, etc.).
Thursday, November 6, 2008
One part of MapInfo that is extremely powerful is the SQL functionality. It isn't comprehensive, but there are a number of things you can do with it. Where I'm currently living/working (The Northern Territory) has a number of communities that could have several different names for the same location. This comes from communities named by European settlers in the area, and Indigenous names. Basically, you can have a spreadsheet of data that you want to tie to a geographic location via the name of an organisation, but the spreadsheet names might not necessarily match the database names that have the latitude and longitude. So I want to find out which values in the spreadsheet are "missing" from the database. In other words, which values did not join.
Two Tables: sdss_geography and tutorial_sample_2
Two Columns: sdss_geography.Organisation_Name and tutorial_sample_2.School
The first step is to perform a Join on the two tables. Using Query --> SQL Select
Select Columns: *
From Tables: sdss_geography, Tutorial_Sample_2
Where: sdss_geography.Organisation_Name = Tutorial_Sample_2.School
Ordered By: School
into Table Named: InitialJoin
Next go to File --> Save Query and save the InitialJoin query as a table. Then close the query table and load the InitialJoin.tab table.
Once the table is loaded, go back to Query --> SQL Select.
Select Columns: *
From Tables: Tutorial_Sample_2
Where: not School in (select School from InitialJoin)
Ordered By: School
into Table Named: MissingJoin
This produces a table of missing values. I take this table and make sure the name and spelling matches the central database.
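For anyone outside MapInfo, the same two-step check can be collapsed into a single standard SQL anti-join. A small sketch via Python's sqlite3 (the sample rows are made up; the table and column names follow the example above):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sdss_geography (Organisation_Name TEXT);
    CREATE TABLE Tutorial_Sample_2 (School TEXT);
    INSERT INTO sdss_geography VALUES ('Alice Springs HS'), ('Darwin HS');
    INSERT INTO Tutorial_Sample_2 VALUES ('Alice Springs HS'), ('Darwin High');
""")

# Schools with no matching Organisation_Name are the ones that
# would not join -- the "missing" values.
missing = [r[0] for r in con.execute("""
    SELECT School FROM Tutorial_Sample_2
    WHERE School NOT IN (SELECT Organisation_Name FROM sdss_geography)
    ORDER BY School
""")]
print(missing)
```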
Hopefully that helps someone else.
Monday, November 3, 2008
It started through a great blog called Kelso's Corner. It's a cartography/visualization oriented blog/website, and he posts examples of some great (mostly interactive) cartography. A lot of them, not surprisingly, come from the New York Times. Newspapers don't always produce the greatest or most appropriate maps, but the NYTimes seems to make the extra effort (The Economist is another one that does a great job). Anyway, I "discovered" that they have an interactive visualization creator called Visualization Lab. They have a number of visualization techniques available for users to create their own interactive visualizations. According to this, it is based on technology from IBM Research. I didn't explore too deeply, so I don't know if the NYTimes site allows you to load your own data, but Many Eyes does. Honestly, I'm a bit wary of these "map your own data" neogeography things, but I think the NYTimes site has done a great job of restricting users so that they make an appropriate map (e.g. a choropleth using derived data instead of raw counts). They are even using a good map projection for their world map data! I'm not too keen on the bubble visualization, because when I see it I expect the countries to be in the right place, so I find it hard to read. I guess I couldn't really find any order to the arrangement (hey, I'm a Geographer, I look for spatial patterns automatically). Either way, I thought it was pretty well done, and fast.
Naturally, when I see something like the Visualization Lab, I immediately think "How can I do that?" Well, the NYTimes has made it easier for me to try to make my own. They didn't release the code or anything, but they now have their own data API. One of the first they have released is the campaign finance API. I believe this would be the same data used for creating this interactive map.
Wednesday, October 29, 2008
It is available here...Hopefully it will help someone.
Sunday, October 26, 2008
I've been trawling the technical forums a lot recently while learning more about Windows Presentation Foundation, which I'm using in a little project. I've noticed certain commonalities across forums that bug me...which I will now list...giving you ample opportunity to ignore this entry.
For one, certain forums award points to posters, and sometimes the points lead to a prize of some sort. I suppose this is an incentive to get people to participate. What bugs me is when a poster, new or old, ends the post with "If this answers your question be sure to mark it as answered." I like the people who don't care about the points and just help answer the question.
I also don't like the posters who berate the question asker, especially if it is obvious English is a second language. It's just uncalled for. On the other hand, I don't like it when the question asker writes "URGENT!!!!", or doesn't receive a response within an hour and posts "HELLO? NO ANSWER!!!!??" But even that doesn't warrant berating them.
Finally, I don't like the forum lecturers. These are the people that instead of helping or answering questions lecture the question asker or poster about how to post. Kind of like what I'm doing right now...no, no, it's different, really it's different. :)
Basically, I wish every forum could be like CartoTalk, where the posters love the topic and helping people, and everyone is polite and friendly. I just don't get why some posters treat the question askers like they are wasting the posters' time. Don't volunteer your time if you're going to be a jerk about it.
Just my 2 cents.
Thursday, October 23, 2008
Basic CAD stuff like mirroring and filleting. The Copy Style tool is cool and can be useful at times; it's like the similar feature in Word (AutoCAD has one as well).
There are some more sophisticated options such as splitting a polyline (exploding) to different line segments. There is a tool to add nodes at intervals on a segment. The Calculate angle and direction tools will be useful for more precise drawing.
I think the Create Lines and Create Polylines from database are interesting. Seems to be a way to store geographic data without using a spatial object...but with SQL server 2008 express this might not be necessary...
Basically they've added some nice little features that are part of a CAD application.
I found the tools a little cumbersome initially, but it didn't take long to figure them out.
I've complained about Named Views on here a few times, so with that in mind here is a feeble attempt to produce something "different." This is my first MapBasic application, so feel free to improve upon it. Basically it adds two options to the Map menu - Load Saved View and Save View. All it does is write the centre coordinates and zoom to a text file. Nothing fancy, but it might work for you.
For the source code.
For a compiled file (mbx, version 9.5).
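Under the hood there isn't much to the tool: it just writes the map centre and zoom out and reads them back. A sketch of that round trip in Python (the one-line file format here is my own guess, not necessarily what the mbx actually writes):

```python
# Save a map "view" (centre coordinates plus zoom) to a text file and
# restore it later. Format: one comma-separated line.
def save_view(path, centre_x, centre_y, zoom):
    with open(path, "w") as f:
        f.write(f"{centre_x},{centre_y},{zoom}\n")

def load_view(path):
    with open(path) as f:
        x, y, zoom = f.read().strip().split(",")
    return float(x), float(y), float(zoom)

# Hypothetical centre near Alice Springs with a 15 km zoom value.
save_view("saved_view.txt", 133.87, -23.70, 15000.0)
print(load_view("saved_view.txt"))
```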
Still, I couldn't resist an experiment. I started a new workspace with one Map window and one Layout window. I saved the workspace, then added a piece of text to the layout, closed the layout window, and saved as a new workspace. I compared the two in Notepad, and the layout was gone in the second workspace. I was secretly hoping that it just retained all the commands, but of course it really isn't efficient to save every command. Oh well, nothing is perfect. In the end, I shouldn't be trying to fit MapInfo into an AutoCAD or ArcGIS mold; all three are different programs. It's useful to know the quirks so that I can compensate when using any of the programs.
Tuesday, October 21, 2008
Thursday, October 9, 2008
On the one hand, MapInfo is like a rock and doesn't change. On the other hand, they haven't fixed fatal flaws with their program (IMHO), like workspaces.
I'll post the tutorial (edited to remove gov' related info) when it is done.
Monday, September 29, 2008
Now I'm exporting ArcScene scenes to 2D images. I had originally done this on my laptop and had everything set up there. There were problems exporting the image with background vector data (roads and coastline). The coastline was a polygon and came out with extra lines cutting across the 3D, so my workaround was to export the vector and 3D as separate images and then stack them as layers in Photoshop Elements. This works pretty well. Now I'm working on the desktop and discovered that the images don't export at the same size. The reason is that the export dimensions are determined by the monitor size and resolution. You can also change the export dimensions by resizing the ArcScene window. This is a pain to duplicate, but at least now I know what the problem is.
This is the mess that I came up with. Probably a good idea to save your .sxd as something else before running the code. I also needed to close ArcScene and open it back up before the settings took effect. The base heights disappeared, and I'm not sure why, but I already had a bit of code to set the base height for each layer because I'm lazy and didn't want to go layer by layer setting the base heights.
Here is the code to set the source:
Private Sub SetSource()
    Dim pDoc As ISxDocument: Set pDoc = ThisDocument
    Dim pScene As IScene: Set pScene = pDoc.Scene
    Dim pSG As ISceneGraph: Set pSG = pScene.SceneGraph
    Dim pSV As ISceneViewer: Set pSV = pSG.ActiveViewer
    Dim il As Integer
    Dim pNewWorkspaceName As IWorkspaceName
    Set pNewWorkspaceName = New WorkspaceName
    With pNewWorkspaceName
        .PathName = "D:\gamblingout"
        .WorkspaceFactoryProgID = "esriDataSourcesRaster.RasterWorkspaceFactory.1"
    End With
    For il = 1 To pScene.layerCount - 1
        Dim pLayer As ILayer: Set pLayer = pScene.Layer(il)
        Dim pRLayer As IRasterLayer: Set pRLayer = pLayer
        Dim pDataLayer2 As IDataLayer2: Set pDataLayer2 = pRLayer
        Dim pDatasetName As IDatasetName
        Set pDatasetName = pDataLayer2.DataSourceName
        Set pDatasetName.WorkspaceName = pNewWorkspaceName
        Set pDataLayer2.DataSourceName = pDatasetName
    Next il
End Sub
You will probably need to change the start layer, as I had it set as 1 instead of 0.
Here is the code for the base heights.
Public Sub Set3d()
    Dim pDoc As ISxDocument: Set pDoc = ThisDocument
    Dim pScene As IScene: Set pScene = pDoc.Scene
    Dim il As Integer
    For il = 0 To pScene.layerCount - 1 ' layers are 0-indexed; adjust the start index to skip layers
        Dim pLayer As ILayer: Set pLayer = pScene.Layer(il)
        Dim pLayerExt As ILayerExtensions: Set pLayerExt = pLayer
        Dim p3dProps As I3DProperties
        Dim i As Integer
        ' look for 3D properties of layer:
        For i = 0 To pLayerExt.ExtensionCount - 1
            If TypeOf pLayerExt.Extension(i) Is I3DProperties Then
                Set p3dProps = pLayerExt.Extension(i)
                Dim pSurf As IRasterSurface
                Dim pBands As IRasterBandCollection
                Dim pRasterLayer As IRasterLayer
                Set pRasterLayer = pLayer
                ' drape the layer on its own first raster band
                p3dProps.BaseOption = esriBaseSurface
                Set pSurf = New RasterSurface
                Set pBands = pRasterLayer.Raster
                Set pSurf.RasterBand = pBands.Item(0)
                Set p3dProps.BaseSurface = pSurf
                Exit For
            End If
        Next i
    Next il
End Sub
There are no checks to make sure the layer is raster and not a feature layer. This is one area that could be expanded. Since it is for a fairly custom file, I know which layers are what type.
Wednesday, September 17, 2008
The Flow Map Layout tool that comes with the article is actually pretty slick, at least when using the supplied examples. Instructions are unfortunately lacking, so I'm not sure how to use multiple root points. The output is nice and allows export to an EPS file, which could be brought into Adobe Illustrator and edited. As ambitious as I can be, I'm thinking of creating something similar for ArcGIS but using a shapefile. This would give a lot of control over the final product. In their tool you can move things around too...Anyway, there are limits, and I'm a control freak.
The camel project finished up, as much as that type of project can finish. It's one of those projects you can keep adding to and making more sophisticated. It was part of a larger project, but it sounds like it will also be produced as a separate GIS/model report. See how that goes. It was an interesting and fun project to work on. Good people too.
Tuesday, September 16, 2008
Saturday, August 16, 2008
Anyway, that's my rant of the day...
Thursday, August 7, 2008
Tuesday, August 5, 2008
The information business is being transformed by the Internet into the sheer noise of a hundred million bloggers all simultaneously talking about themselves. - Andrew Keen, The Cult of the Amateur
I was led to that quote in a roundabout manner via James Fee's blog to an entry by Sean Gorman. There is an interesting discussion on the tensions between the Geoweb and the GIS industry (ESRI, MapInfo). In short, the GIS industry apparently considers GIS to be what professionals do and the Geoweb to be for amateurs. I don't really agree with this, but I don't see too much of a problem with the terms used. I do agree that the Geoweb is not GIS, at least not at this point. I see a little danger in people who do not understand the fundamentals of maps creating maps, but you get this even with so-called "professionals." Plus, the people who build the Geoweb generally do have an understanding. I see GIS and the Geoweb on the same spectrum, supplying different services around spatial data; they aren't the same, but they aren't too different. People who do Geoweb are professionals at the Geoweb, and I'm definitely an amateur at it. I'd like to think of myself as a professional at GIS, and a Geoweb person would probably be an amateur at that. Everybody is a little defensive about their industry (me too), and a little protectionist; for example, just tell a Geoweb zealot that it isn't GIS. When I say it, it isn't meant as an insult. There are just a lot of capabilities in a GIS that aren't there in the Geoweb, just as the Geoweb has capabilities that aren't in a GIS. The problem with ESRI is that they want their software to do everything; it's not practical or possible. I agree with James Fee's sentiment about all getting along - after all, we are all under the same Geography hat. Sean Gorman had a link to The Cult of the Amateur, which had an excerpt you could read. I found the bits that I read interesting, though I don't agree with his points entirely. In the end the web is a quasi-market, and users gravitate to the best sites, blogs, and news organizations that meet their criteria.
If a film review blog is filled with boring analysis and misspellings, then it is not going to be popular and won't overshadow someone like Roger Ebert. This blog is proof positive that people avoid crap (except for the 7-10 readers, one of whom is my Mom). Even Wikipedia has had to tone down the everyone-can-edit encyclopedia and now uses "experts" to do the writing or monitoring. I do think the inclusion of a comments section on news articles is ridiculous; although sometimes entertaining, they are usually filled with vulgar, bigoted remarks. I forget whose axiom it was about Usenet newsgroups, but if a thread goes on long enough eventually someone will call someone else Hitler or a Nazi. That is perhaps where the Web's greatest enemy lies: anonymity. It makes it difficult to tell where certain information comes from, but unlike Keen I'm pretty sure Web 2.0 users can tell that a blog is a form of op-ed piece, and if you visit the Ford blog you can expect propaganda. Perhaps the greatest flaw in Keen's argument is that the "quality of public civil discourse" has been declining since before Web 2.0. Personally I point the finger at political pundits like Rush Limbaugh, Ann Coulter, Michael Moore, Al Franken, and so on, who leave little room for "civil discourse" and bifurcate an already bifurcated system. Anyway, that's way off topic and probably not the type of info you would read a GIS blog for. So I apologize, but I've typed it and it would take too much effort to remove it.
Wednesday, July 30, 2008
This is a terrible example for them to use. If I'm not mistaken, bodies of water are treated as a single elevation at the water surface (unless showing bathymetric contours), so showing contours cutting off at the edge of the water surface is totally inappropriate. The contours should go around the pond, not through it!
On a side note, the Mapping Center does have some useful information. I really like all the historic map symbols they created.
Monday, July 21, 2008
The camel project finished up shortly before the trip. I was a little nervous, but it seemed to work out. At the least, it gives some base on which to develop a more robust decision support system for camel management. The difficulty I ran into was that because everything was tied to access (i.e. roads), most of the management plans overlapped. There were ways around it, though; for example, a cost-distance surface from the abattoir limited camel regions to those close to the abattoir along the road network. I actually used cost-distance surfaces most of the time. Instead of using point locations for, say, boreholes (necessary for transporting the camels because of the amount of water they require), I used those as starting locations for a cost-distance surface with the roads as a friction surface. I also used the major roads as a starting place, with tracks and 4WD roads as a friction surface. This created an implicit cost (cost as in the cost of implementing a camel management action), where it was cheaper to implement a plan on a major road (easier access) than on a track, and off-road was of course the most costly. I originally based aerial culling on roads, but then switched it to a community base, which gave a different range. It was an interesting project. Difficult due to the time constraints, though.
Finishing an article based on some school choice analysis I worked on in 2007. Hopefully it will be accepted for a special edition on mapping school choice. I used kernel density estimation to explore some of the changes in distribution across the school district. Also calculated network distances to see how distance from home to school has changed.
Working on another article using kernel density to calculate a segregation index. It is based on the methodology in this article by O'Sullivan and Wong, A Surface-Based Approach to Measuring Spatial Segregation. I'm not entirely sure if this will actually happen, but I'm experimenting a little bit.
Monday, June 30, 2008
I tried to use the MapCad tools but, honestly, couldn't find how to load them. I ran the installer, twice, but nothing happened. I looked in the help file for MapCad and it said to go to the Help dropdown and go to MapCad Help. This, of course, did not exist. So again I ask the question: why spend over $1,000 when you can get the same functionality out of Manifold (plus both of its extensions for that amount)? It will supposedly work with MSSQL 2008 spatial as well.
I didn't bother playing with the .NET features...
Friday, June 27, 2008
Anyway, all the talk around 9.3 has overshadowed the pending release of MapInfo 9.5. It's fairly obvious from my posts that I am primarily an ESRI user, and I have given MapInfo a fair share of criticism. They must have heard the call (I'm certain I wield such influence), because it sounds like they have made some substantial changes. They are a little behind the times on some things, like the .NET MapBasic features; ESRI has had that for over a year now. I'm not too impressed by the MSSQL support, because it sounds like everyone will have it, although I bet MapInfo won't require an extra extension to handle it. I am hopeful, but very much doubt, that there will be direct support in ArcGIS 9.3 without the need for ArcSDE. MapInfo should also add PostGIS support, in my opinion. It seems to be gaining commercial acceptance through Manifold, ArcSDE, zigGIS (also ArcGIS), etc...
I'm downloading an evaluation copy of 9.5. I can't test everything, but I read that they designed the interface with .NET, so I'm hoping it doesn't look like crayon on sandpaper anymore. I also hope they replaced workspaces with something more functional. I'll post my findings here.
Friday, June 20, 2008
Wednesday, June 18, 2008
One other part struck me too: when Juliette Binoche's character is asked if she knows anyone from a dying soldier's hometown. I think it is later in the movie that she questions this action - the desire to see someone from where you come from. This probably interested me more than normal, as I am now living abroad. We just went to a party of Americans living abroad. The only reason for the party was that we were Americans, or spouses and children of Americans. So that was the one connection - you're American. There was even a hierarchical scale to the connection - America --> state --> close to the same town....It's just funny, because without the concept of place, it's not like we would ever have met or been friends with any of these people, especially in the States. We only had one thing in common. This happened when I traveled too. It's such an easy introduction - WHERE are you from? As if where defines who you are. I would probably argue it does partly define you.
I noticed that Harm de Blij has a new book coming out this winter/summer (july), and I think it will be about this topic. From the product description:
"In recent years a spate of books and articles have argued that the world today is so mobile, so interconnected and so integrated that it is, in one prominent assessment, flat. But as Harm de Blij contends in The Power of Place, geography continues to hold billions of people in an unrelenting grip. We are all born into natural and cultural environments that shape what we become, individually and collectively. From our "mother tongue" to our father's faith, from medical risks to natural hazards, where we start our journey has much to do with our destiny, and thus with our chances of overcoming the obstacles in our way."
I find the argument interesting, but I can only offer anecdotal experience as my view. I also find geography in the movies interesting. A recent AAG newsletter had a short article about this. I think it was just before the Boston conference, because all the movies were about or took place in Boston.
I forgot the book came out shortly before the movie, but still in the early '90s. Although the book and the movie share some themes, I think the movie had a few of its own. Anyway, here are some articles on the book here and here.
Sunday, June 15, 2008
I see with Google Maps why it is necessary to use lat and long - it just makes it easier for Google if people conform to standard latitude and longitude. Projections, or the lack thereof, can greatly influence people's perceptions of the world. Look at the Mercator projection and Greenland. I think this is related to Dr. Parks' talk as well. She says people don't have an understanding of satellites, but they also don't have any understanding of projections.
I can't tell if Google Earth is a culprit of this as well, because I'm not sure whether it uses a geoid or some other "true" representation of the Earth.
Friday, June 13, 2008
Tuesday, June 10, 2008
Monday, June 9, 2008
I also have managed to use Raster Calculator to duplicate the Fuzzy tool in IDRISI. It duplicates the Linear function, and there are two ways to calculate it: one for monotonically increasing and one for monotonically decreasing. The formula rescales to 0 to 255.
Con is the map algebra conditional function: Con(test, value_if_true, value_if_false).
Here is how it works for monotonically decreasing:
INT((Con([RasterToBeScaled] > 900, 0, ((900 - [RasterToBeScaled]) * 255) / 900)) + .5)
The Con part reads: if the cell value is greater than 900, give the cell a value of 0. If the test is false, calculate the value with:
((900 - [RasterToBeScaled]) * 255) / 900
INT and + .5 round the values properly and remove the decimal places.
The range of this example is 0 to 900, so you need to change those values depending on your maximum and minimum. In general you divide (900 - [RasterToBeScaled]) by the range (max - min raster values); since my range was 0 to 900, the range is just 900.
If you need this to increase then you have to change some of it around.
INT((Con([RasterToBeScaled] > 900, 255, (([RasterToBeScaled] - 0) * 255) / 900)) + .5)
Now whatever is less than zero is zero, and whatever is greater than 900 is 255. Everything in between is calculated using the formula:
(([RasterToBeScaled] - 0) * 255) / 900
So now we subtract the lowest value in your range from the cell value, and everything else is the same.
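The two Raster Calculator expressions are just a linear rescale with clamping. Here is the same per-cell arithmetic written out in Python as a sanity check (0 and 900 are the min and max from the example):

```python
def fuzzy_linear(value, min_val, max_val, increasing=True):
    """Linear fuzzy membership scaled to 0-255, as in the expressions above."""
    rng = max_val - min_val
    if increasing:
        if value <= min_val:
            return 0
        if value >= max_val:
            return 255
        scaled = (value - min_val) * 255 / rng
    else:
        if value >= max_val:
            return 0
        if value <= min_val:
            return 255
        scaled = (max_val - value) * 255 / rng
    return int(scaled + 0.5)  # same rounding trick as INT(... + .5)

print(fuzzy_linear(450, 0, 900, increasing=False))  # -> 128
```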
This is similar, but not quite what I was looking for.
Hope that helps...
Wednesday, June 4, 2008
Here is the help file that I found. Look under the Using a multivalue input in a script section.
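As I read that help section, a multivalue input arrives in the script as one semicolon-delimited string, so the script has to split it apart itself. A small sketch of just the parsing step (the sample string stands in for what gp.GetParameterAsText(0) would return; the quoting behaviour is my assumption):

```python
def parse_multivalue(param_text):
    # Values containing spaces can come through wrapped in quotes,
    # so strip those off each piece after splitting on ";".
    return [v.strip("'\"") for v in param_text.split(";") if v]

print(parse_multivalue("roads;'coast line';rivers"))
```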
Another problem I have with Python mostly comes from ArcGIS, which is the only time I ever use Python. Well, most of my programming revolves around spatial software in some way. Every time I try to do something with geoprocessing scripts, I seem to run into a bug. I once wrote a script that exported coverages from all the feature classes in a workspace. The script stopped functioning from the toolbox with Service Pack 3 (9.2); I think it works again with Service Pack 4. Also, the Reset method apparently does not work on a lot of the cursors, so forget about reusing a cursor. It seems like every time I need to write a little geoprocessing script I have to figure out a workaround because I find another problem. Granted, I've had some success too.
On a completely different note:
I was also updating my antivirus and saw that one way to help them is to add a link to their site. I use the ClamWin open source antivirus program. I've used it for maybe a year now and have been happy with it. It isn't the beast of software that Norton is. I can only vouch for it on a personal computer, so I don't know how it would work at an "enterprise" level. I haven't had any viruses, but then again, I didn't have any before I installed it either...knock on cyber wood.
Tuesday, June 3, 2008
Obviously, communication can be twisted (see How to Lie with Maps). I think it is even easier to lie with satellite imagery. If a satellite image is shown in a presentation and you are told it is of a certain location at a certain time, you just have to accept or not accept that as the truth.
The link above was mostly critical of her critique of the Darfur project. Again, following my explanation, then the point is that the unbiased imagery begins to serve a biased purpose (or second biased purpose if you count the original acquisition), that of stopping the Darfur crisis. I'm not arguing against or criticizing the Darfur project, but recognizing that there is a goal behind showing the geographic information...
Anyway, when it comes to working in GIS, I use it as a tool, and I'm very much a pragmatic post-positivist (if I may mix epistemologies). I wouldn't be surprised if we start to see more conversations like this popping up, especially with more and more geographic information becoming readily available, and programs like Google Earth gaining in popularity...
Sunday, June 1, 2008
I hope to do a lesson every couple of days. Starting with lesson 8, there will probably be a total of 2 or 3 lessons that build a little project.
Hope you enjoy.
Wednesday, May 28, 2008
Please visit the site here.
Please post your comments about the lessons at this post.
Hope you enjoy.
Thursday, May 15, 2008
Wednesday, May 7, 2008
There have been a number of posts about working with MapInfo. You may also be interested in them.
Let's get one thing straight - I hate working with MapInfo. I know a lot of people love it. At least one person has said, "I used to work with ArcView, and then I tried MapInfo and never switched back." I can understand this if you worked with ArcView 3.2 (which I hate working with too), but the newer ArcGIS is a different program. At some point, as a software company, you have to wake up and stop building on top of your legacy programs. AutoCAD has done this (at least twice), ESRI did this, and Microsoft does this annoyingly (Office 2007). Yes, this creates bugs and in some cases a slightly less stable program, but the capabilities and improvements, in my opinion, outweigh the negatives. Now, there are good ways and bad ways to handle this. Autodesk handled it well, simultaneously releasing versions of the old Land Desktop and the new Civil 3D, increasing the stability of Civil 3D with each release, and eventually phasing out the old product. The temptation is not to learn the new product because it is easier to work in the old one. Microsoft handles this badly, because they just give you something new and different. I've heard it estimated that it takes at least a full month of using the new version of Office regularly to get back to your previous level. If you are in a workplace, when do you have that time? I'm sure ESRI didn't phase well either, but that was before my time. My main question is why anyone would pay $3,000 for MapInfo when you get essentially the same product for $300 with Manifold GIS (complete with its own annoyances)?
I will add this caveat, though - I come from an AutoCAD/ArcGIS background. Obviously I am biased, and I'm probably trying to force Arc methods on MapInfo tools. So if my tips point to something I'm doing wrong, hopefully I will be corrected. I appreciate any additions, corrections, and complaints. I've spent enough time ranting; time to move on to some tips. My focus is on layouts, maps, and labels.
1. Use workspaces. You are forced to use workspaces, so you might as well use what you are given. Save often, too. This is a no-brainer, but it was a struggle for me to get over how crappy workspaces are. Be sure to close all tables before opening another workspace, or you'll merge two workspaces together. Workspaces are not like a drawing in AutoCAD, a map in Manifold, or an MXD in Arc; those are static saves of your "project". A workspace is the closest thing to a project MapInfo offers. If you happen to close a layout or a map browser without saving a workspace, then you have lost that information. That doesn't happen in other "normal" software. If you couldn't tell, I've made this mistake several times.
2. Add the Named Views tool to your program (Tools --> Tool Manager --> Named Views), and use it. Let me explain what I mean by views: a view on a layout is a frame that looks out onto your world. In most cases the view looks out onto a map browser. If you don't use named views, then it is difficult to maintain the same view in a frame on a layout. As soon as you zoom or pan in a map browser, this changes the associated views in your layout. To get back to where you were, there is Previous Zoom, but that is lost if you move twice. A named view is really your only way (as far as I can tell) to retrieve a view. Named views are similar to a bookmark in ArcGIS, or a view in Manifold. There is no need for views in AutoCAD because it is intelligent enough to remember your view regardless of what you do in modelspace (data view, map browser). I wish GIS systems would do this as well, but they don't.
The main problem with named views is that a view is saved at the program level. This means it is not saved in the workspace. Also, it is only saved at the program level if you shut down properly. If MapInfo crashes, all the named views created since your last shutdown are lost, even if you've saved your workspace. If you have two instances of MapInfo open, named views from the first one you shut down are lost. If someone else opens up a workspace, they will need to create their own views. Please, please, someone tell me I'm an idiot and there actually is something better than this! (Or just tell me I'm an idiot if you prefer, but I'd prefer something more constructive.)
3. Layouts and views. This isn't a tip, but a warning. You can't double-click on a view and access the map browser. This is just an irritation if you've used Arc or CAD. You just need to switch back to the map browser. It's a slight loss of productivity, but it doesn't matter too much.
4. Labeling is pretty good in MapInfo. It's sort of a combination of automatic labeling and manual labeling. That makes labeling fairly easy, because you can turn the labels on and then move them around. The nice thing is they don't lose their associativity with the layer, so you can turn them off. You can also change the style en masse, which is handy. When moving labels, it behooves you to first set up your layout and views (frames) with a chosen scale. This, of course, changes your map browser scale. Create a named view of this map browser view, so you can get back to it later. Then change the map browser scale to match the layout view scale. Now move the labels. Go back to the named view when done, and check out the results in the layout. If you don't do this, and instead edit the labels at the named view scale, they will suffer from "5-year-old needing to pee" syndrome and bounce all over the place. This syndrome also happens in Arc on occasion (in particular with the dynamic scale bar in layout view). Here is an example. Add a frame to a layout and select a map browser. This is your view of the map browser. When you double-click on the frame/view, you can set the scale (1 cm = 20 km). When you go to your map browser, the scale will be set as something like 1 cm = 11 km. Save the named view, then change the scale to 1 cm = 20 km. Move the labels. Go back to the saved view (this changes the scale back to 1 cm = 11 km). The labels should look correct in the layout view.
If you're not confused, then I haven't done my job.
Wednesday, April 30, 2008
Thanks to an anonymous poster below, it was brought to my attention that the chart in the description info window does not load in Google Earth 4.3. At first I thought it was just a very slow load time, but it doesn't ever load. After some experimenting, I discovered that for some reason, when I add the labels to the chart, the image never loads. If I remove the following parameter from the URL then it loads fine: &chl=Male|Female. I haven't the slightest idea why it does this, and after all, GE 4.3 is still in beta (what of Google's isn't still in beta?).
If you want to see this in action, create an empty text file with .kml instead of .txt for the extension. Paste the following KML into the file and save it. The image shouldn't load in the info window. Then remove the &chl=Male|Female and save. Now the image should load.
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://earth.google.com/kml/2.2">
<Placemark>
<name>Chart test</name>
<!-- the placemark location is arbitrary; only the description matters for this test -->
<Point><coordinates>0,0,0</coordinates></Point>
<description><![CDATA[<img src="http://chart.apis.google.com/chart?chs=250x100&chd=t:59,41&cht=p3&chf=bg,s,65432100&chl=Male|Female"> ]]></description>
</Placemark>
</kml>
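The chart URL follows the Google Chart API pattern, and the `chl` label list is pipe-separated. A quick way to experiment with the GE 4.3 behavior is to generate the URL with and without the label parameter; here is a sketch using only the parameters shown above:

```python
# Sketch: build the Google Chart API pie-chart URL used above, with and
# without the chl label parameter that trips up Google Earth 4.3.

def chart_url(male_pct, female_pct, labels=True):
    params = [
        ("chs", "250x100"),                           # image size
        ("chd", "t:%d,%d" % (male_pct, female_pct)),  # data series
        ("cht", "p3"),                                # 3D pie chart
        ("chf", "bg,s,65432100"),                     # background fill
    ]
    if labels:
        params.append(("chl", "Male|Female"))         # pipe-separated labels
    query = "&".join("%s=%s" % (k, v) for k, v in params)
    return "http://chart.apis.google.com/chart?" + query

with_labels = chart_url(59, 41)                  # fails to load in GE 4.3
without_labels = chart_url(59, 41, labels=False)  # loads fine
```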
Tuesday, April 29, 2008
So I've started work on another project. I am a little unsure about confidentiality, so I won't list everyone involved. I'm sure the project will be made public, or at least the publication will be. It probably wouldn't be too difficult to guess who this work is for if you are in Australia. I'll be modeling camel management plans...while it is tempting (and easier) to just turn in a set of photographs of myself shooting camels, I'm of course referring to GIS-based models. Since the turnaround time on this is pretty quick (2 months!), the model is a fairly simple multi-criteria evaluation. I'm planning on using IDRISI (Andes?) to do this, but in the back of my head I'm thinking of writing a plug-in for MapWindow. Two months isn't long, and IDRISI has a number of built-in tools for performing a multi-criteria evaluation, as well as for processing rasters (distance surfaces, friction surfaces, etc...). The reason I want an open source solution is to create a user interface that could be used by anybody. That way a land manager could come in, and given a set of criteria (most likely predefined), spit out a map showing potential locations for different camel management plans.
As idyllic as camels look in the Australian desert, they aren't a native species. They were brought in as pack animals, and in many cases the train and road routes actually follow the same routes as the old camel trains. Now their population is approaching one million, and they can be quite destructive to infrastructure and biodiversity (or vice versa). Probably the unique part of this model is that it tries to identify locations for management plans based on a perspective, while also including perspectives in the model. Obviously, when a question is asked it comes from a certain perspective, and to answer that question certain criteria will be relevant. On top of that, there will be a layer that explicitly shows where certain management plans cannot be implemented based on the community, land owner, etc... This will most likely be a constraint factor...but it could also be a distance surface, I suppose...Have to think about that one.
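The weighted-overlay core of a multi-criteria evaluation is simple enough to sketch without IDRISI or MapWindow: each criterion is a normalized raster, suitability is the weighted sum, and a Boolean constraint layer zeroes out cells where a plan can't be implemented. The factor names, weights, and tiny grids below are made up for illustration:

```python
# Sketch of a simple multi-criteria evaluation: weighted linear combination
# of normalized factor rasters, multiplied by a 0/1 constraint raster.
# The factors, weights, and 2x3 grids are illustrative only.

def mce(factors, weights, constraint):
    """factors: list of equally sized 2D grids scaled 0..1
    weights: one weight per factor (should sum to 1)
    constraint: 2D grid of 0/1, where 0 excludes a cell outright."""
    rows, cols = len(constraint), len(constraint[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            score = sum(w * f[r][c] for f, w in zip(factors, weights))
            out[r][c] = score * constraint[r][c]  # constraint masks the score
    return out

# Hypothetical inputs: distance-to-water and distance-to-road factors,
# plus a community/land-owner exclusion layer.
water = [[0.2, 0.8, 1.0], [0.0, 0.5, 0.9]]
roads = [[1.0, 0.6, 0.4], [0.8, 0.3, 0.1]]
nogo  = [[1, 1, 0], [1, 1, 1]]
suitability = mce([water, roads], [0.7, 0.3], nogo)
```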
I'll keep you up-to-date on it.
Friday, April 25, 2008
I think Australia, and probably all of the Commonwealth, gives geography a higher place in education than it has in the United States. Let's face it, we Americans really aren't geographic people; the geographic literacy surveys show that. I think the downfall of geography is well documented in the American geography community. From the loss of geography at Ivy League universities to the advent of social studies, geography lost ground in the 20th century. Take a look at Why Geography Matters for a good read. It's coming back, though; the Association of American Geographers is reporting record membership and conference attendance numbers, and Harvard has even made steps to reintroduce geography. I'm sure we all know this is partially due to Google Maps and Google Earth. These technologies have made geography extremely accessible.
Moving to Australia has definitely been a good career move for me.
Thursday, April 10, 2008
Well, I think I'm off Google Earth for a while. I was focused pretty heavily on it for this presentation. In my spare time I'm working on a Box Shaped World tool set. The focus is mostly on tools that I need, weighted heavily toward spatial statistics. I have a Ripley's K that is functioning quite well, and I think I've got a basic kernel density tool built. All the tools are built in VB.NET for MapWindow 4.4 (and presumably 4.5 when that is released). I chose MapWindow for my project for a variety of reasons: it supports .NET (the only programming language I know), it has an editing environment for shapefiles (unfortunately it only supports shapefiles for vectors, but shapefiles are a universal format), it has extensive raster/grid support, and there is an apparent plan for future directions. Plus it is free! My tools will be free as well, but I don't think I'll release the source code, at least not initially. I thought about SharpMap, but didn't go with it. I like SharpMap and hope it continues along its path. I didn't want to program a GIS interface, and there isn't much editing support from what I could gather. I guess (this may or may not be true) my impression is that the MapWindow folk seemed to have it a little more together, but I'm not involved in the development process and I appreciate everyone's efforts and time. SharpMap seems to be completely volunteer based, too, where MapWindow is university based...Again, I like SharpMap and hope to use it in the future. Anyway, my tools will advance as I have time for them...
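For anyone curious what a Ripley's K implementation involves, the basic estimator is only a few lines: count the ordered pairs of points within distance d of each other, then scale by study-area size over n squared. This sketch deliberately skips edge correction, which a production tool would need:

```python
# Minimal Ripley's K sketch: K(d) = (A / n^2) * number of ordered pairs
# within distance d. No edge correction, so values near the study-area
# boundary are biased low.
from math import hypot

def ripleys_k(points, d, area):
    n = len(points)
    pairs = 0
    for i, (xi, yi) in enumerate(points):
        for j, (xj, yj) in enumerate(points):
            if i != j and hypot(xi - xj, yi - yj) <= d:
                pairs += 1  # ordered pairs: (i, j) and (j, i) both count
    return area * pairs / float(n * n)

# Under complete spatial randomness, K(d) is expected to be near pi * d^2;
# values above that suggest clustering, below it suggest dispersion.
pts = [(0.1, 0.1), (0.2, 0.1), (0.9, 0.9)]
k = ripleys_k(pts, 0.2, 1.0)
```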
Wednesday, April 9, 2008
I started by downloading some 2000 census data in census block format, along with SF1 data, from ESRI's free download area. Then I brought it into Manifold selecting just a few census blocks for all of Fort Collins, Colorado, and created their inner centroids. I filtered some of the data that had 0 population from the SF1 table. Clearly, when/if you open this up you will find that I should have weeded a few more points, but on the other hand it shows how quick this runs. My server is pretty slow, and I tested this on a wireless internet connection that didn't have a very strong connection (plus I was streaming both Google Earth and some music). In Manifold, I related the two tables and exported it back out to a shapefile. On export I added two columns for lat and long.
In Visual Web Developer Express I created a generic handler (.ashx). Instead of going through each line of code, I'll just summariz(s)e. Below is a link to a zip file where you can download both the KML file and the ASHX file. I used SharpMap 0.9 to connect to the shapefile and read the table's contents. I just took the percentages of males and females (number of males / population, number of females / population) as my working statistic (truly groundbreaking stuff :)). You can see from the code that it basically just writes out the KML file, creating a Google Chart API URL as it goes.
You can view the final product by opening this KML file in Google Earth. This KML file contains very little KML, and basically just has a Networklink back to the address of the handler.
If you want the files used, they are available here.
As always, any suggestions, comments, questions are much appreciated.
Here are some other examples I've found:
Just noticed the Thematic Mapping Blog also has a post on using charts in Google Earth. Here is the link: http://blog.thematicmapping.org/2008/04/using-google-charts-with-kml.html
or with open layers http://blog.thematicmapping.org/2008/04/openlayers-and-google-chart-mashup.html
Tuesday, April 8, 2008
Of course I'm not the first or last person to do this dynamic KMZ creation with ASP.NET, and a brief search will turn up essentially the same process using #ziplib. Although, I don't think many people are dynamically creating graphs at the same time (I could be wrong; if so, please correct me). I really wanted to use Google Charts to do this, because they have some nice looking charts, but due to some proxy server issues it became more hassle than it was worth. I think if I do a tutorial or something, I would use Google Charts instead. Again, this has probably been done somewhere.
Here is a fix to a problem I was running into. You need to set the size of the file before adding it as an entry: http://community.sharpdevelop.net/forums/p/6986/19897.aspx#19897
Here is a copy of the code I am using:
Private Sub ZipFolder(ByVal currentFolder As String)
    'Zip every file in the folder into a .kmz. Note entry.Size must be set
    'before writing the entry (the fix from the SharpDevelop forum link above).
    Dim Filenames() As String = System.IO.Directory.GetFiles(OutputPath & currentFolder & "\")
    Dim s As New ZipOutputStream(System.IO.File.Create(OutputPath & currentFolder & ".kmz"))
    Dim buffer(4095) As Byte
    For Each file As String In Filenames
        Dim fileInfo As New System.IO.FileInfo(file)
        Dim entry As New ZipEntry(fileInfo.Name)
        entry.Size = fileInfo.Length 'uncompressed size, set up front
        s.PutNextEntry(entry)
        Dim fs As System.IO.FileStream = fileInfo.OpenRead()
        Dim sourceBytes As Integer = fs.Read(buffer, 0, buffer.Length)
        Do Until sourceBytes <= 0
            s.Write(buffer, 0, sourceBytes)
            sourceBytes = fs.Read(buffer, 0, buffer.Length)
        Loop
        fs.Close()
    Next
    s.Finish()
    s.Close()
End Sub
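For comparison, a KMZ is just a zip archive with the KML (conventionally `doc.kml`) and any images inside, and the same folder-to-KMZ step can be sketched with Python's standard zipfile module, which handles entry sizes for you:

```python
# Sketch: zip a folder of KML/images into a .kmz with the stdlib zipfile
# module. Unlike the SharpZipLib stream above, entry sizes are handled for us.
import os
import zipfile

def zip_folder_to_kmz(folder, kmz_path):
    """Write every file in folder into kmz_path at the archive root."""
    with zipfile.ZipFile(kmz_path, "w", zipfile.ZIP_DEFLATED) as kmz:
        for name in os.listdir(folder):
            full = os.path.join(folder, name)
            if os.path.isfile(full):
                kmz.write(full, arcname=name)  # store at archive root
```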
Monday, March 31, 2008
The current project I'm working on is for the Department of Employment, Education, and Training....and water sports. The NT government has some good department names. I'm under the education department, and there is currently a push to geolocate all the schools and associate this with their excellent data warehouse. The Northern Territory has some very remote communities (some are still lacking power/electricity). I'm sure it is obvious to fellow GISers that you need to be able to see where your resources are going, and not going. A map is a simple way to do this. I'm the only GIS user working there, and only on a part-time basis at that, so it is important for everyone to be able to view the data when I'm not available to create a map. Their data warehouse is already running on SQL Server, so I've been building my system on top of that (or next to it). MapInfo was chosen as the GIS, and I'll describe my woes with that program later. I got into Google Earth because another department had set up their own Google Earth server and had a site license for GE Enterprise. That's when I started to explore what we could do with it.
This is what I've come up with. I'm sure I'm not the first person to do this, but I thought I would at least share my methodology.
KML has a special element called a network link. Instead of populating a file with coordinates and geometry, this lets the KML creator tie it to a web-based server-side script. This was extremely easy to do. I created a generic handler (.ashx) in ASP.NET and wrote a small class that created the response in the KML/XML format. The class pulled all of the coordinate information directly from the database server. Pretty cool to have that dynamic capability built into Google Earth, and it only requires a little server-side scripting. Maybe 50 lines of code, if that.
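The pattern is the same in any server-side language: query the database, write one Placemark per row, and hand out a tiny static KML whose NetworkLink points at the script's URL. A rough Python sketch of both pieces (the row fields and handler URL are placeholders, not the original ASP.NET code):

```python
# Sketch of the network-link pattern: a server-side function that turns
# database rows into KML, plus the small static KML that points at it.
# Row fields and the handler URL are placeholders.
from xml.sax.saxutils import escape

def rows_to_kml(rows):
    """rows: iterable of (name, lat, lng) tuples from the database."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<kml xmlns="http://earth.google.com/kml/2.2"><Document>']
    for name, lat, lng in rows:
        parts.append('<Placemark><name>%s</name>'
                     '<Point><coordinates>%f,%f,0</coordinates></Point>'
                     '</Placemark>' % (escape(name), lng, lat))  # KML is lon,lat
    parts.append('</Document></kml>')
    return "\n".join(parts)

# The file you actually distribute is just a NetworkLink to the handler:
NETWORK_LINK = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://earth.google.com/kml/2.2">
  <NetworkLink>
    <name>Schools</name>
    <Link><href>http://example.com/geo_content.ashx</href></Link>
  </NetworkLink>
</kml>"""
```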
That's good, but it is just the location of schools. Obviously the description for the info window that pops up when a location is clicked can be populated with various data from the warehouse, but this might become a bit tedious to code. Instead I just placed a link to a profile page I had already set up. Unfortunately, you can't place an iframe in the window, because it isn't sophisticated enough to handle every scenario. No big deal.
What I really want to do is display the data as a symbol. I could easily classify the symbols using some sort of nominal data stored in the database, or even some continuous data. I had originally created some maps with pie chart symbols in MapInfo showing the break-up of enrollment. I was asked if there was a way to click on the map and see what the numbers actually were. Well, this was a PDF, so the answer in that case was "no." I started to think that if there was a way to do this with Google Earth, it would allow for a much more interactive environment. Using GDI+ and .NET, I was able to create a PNG image of a pie chart from the data in the warehouse while programmatically creating the KML file. Originally, I manually took this data and created a KMZ file. Then I started to use network links and tied my pie chart code in with my server-side script. Now I have a dynamic KML file that creates pie charts on the fly as it is shown in Google Earth (the images are stored on the webserver). Whenever it is opened, it will reflect the current state of the data warehouse. I plan on achieving the same effect with a proportional symbol. .NET has assemblies that allow me to compress data, so I could easily create a fixed KMZ file based on this same method.
I would also like to do something similar using 3D. I attempted to create an extruded polygon based on the data. Unfortunately, these do not change their scale based on zoom. I looked around, but did not find anything. I'm thinking I could do this using regions. But at this point, I'm a little confused how regions work.
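On the region question: a KML Region pairs a bounding box (LatLonAltBox) with level-of-detail limits (Lod), so Google Earth only draws the enclosing feature once its box occupies enough screen pixels. That gives scale-dependent display, though it won't rescale an extruded polygon. A hedged sketch of emitting one; the bounds and pixel threshold here are arbitrary:

```python
# Sketch: emit a KML <Region> that shows its feature only when the bounding
# box spans at least min_pixels on screen. Bounds and threshold are arbitrary.

def region_kml(north, south, east, west, min_pixels=128):
    """-1 for maxLodPixels means no upper limit (feature never hides again)."""
    return ("<Region>"
            "<LatLonAltBox>"
            "<north>%f</north><south>%f</south>"
            "<east>%f</east><west>%f</west>"
            "</LatLonAltBox>"
            "<Lod><minLodPixels>%d</minLodPixels>"
            "<maxLodPixels>-1</maxLodPixels></Lod>"
            "</Region>" % (north, south, east, west, min_pixels))
```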
Anyway, with a little programming it isn't difficult to create a highly interactive map with Google Earth and ASP.NET.