March 11
Contents
- Editor's Note
- Wisconsin Land Information Association Conference
- DeLorme Products for Professionals
- The Deep Web and GIS
Editor's Note
Times Change. That's not news, I know. When GIS Monitor and the Ultimate Map/GIS Directory were launched back in 2000 the Web was a different place. Search engines were still maturing (they still are) and perhaps more importantly, Web surfers were still novices. Now, as we enter 2004, search engines are much better and we are all better users of them.
For that reason I think it's time to refocus the GIS Monitor website. The exploding growth of Web content means that a directory on nearly any topic, even a small directory, is never up-to-date. GIS Monitor's hundreds of pages have fallen prey to just that situation, and I feel there's little use in trying to catch up. The website has therefore morphed into a much smaller, topic-by-topic introduction to useful resources on the Web. I think these pages will be helpful for those just getting their bearings in GIS. Those who are more savvy are ready to tackle the Web's many search tools to track their topics of interest.
A recent article in the Boston Globe supports my vision. It discusses how old-style portals (think of Yahoo or Lycos) are redefining themselves. David Weinberger (a coauthor of The Cluetrain Manifesto and an Internet watcher) explains that "Portals were a table of contents view of the Web…Along came Google, which provided more of a back-of-the-book index view."
There's a second change on the website. For 3 ½ years I've posted press release-type news on the GIS Monitor website. First, I republished the releases, all available elsewhere, in their entirety. More recently, I boiled those documents down to one- or two-sentence summaries and collected a day's worth on a webpage. Those snippets appeared in each issue of GIS Monitor. Since January 1st, I have provided such tidbits only in the Week in Review section of this publication. Recall that press releases are what organizations want you to know or what must be disclosed by law (company filings, for example). What about the rest of the news? That's the material you'll read here each week.
GIS Monitor has been described as "going beyond the headlines." I can't recall if I said that or if someone else did, but I think it's fair and distinguishes the website and publication from others. That statement argues that my job is not to regurgitate what vendors, governments, or scientists have to say, but to put those statements in context for you. It also suggests that my job is to act as a filter to hold back the flood of information you face each day, and pass along just the relevant or interesting tidbits.
I want to do just that, daily, on the website. What you'll find in The Latest section on the main page of the website are paragraph summaries of the most important news of the day that put the topic in context. My hope is that this will be enough information for you to (1) determine if the topic is worth your time, (2) read a summary that satisfies your interest, or (3) explore more detailed information. The most important of these stories will continue to be gathered here each Thursday in the Points of Interest section.
Finally, I want to remind readers that every single GIS Monitor (since August 2000) is available online. I hope you find this archive a valuable resource; I know I do.
I hope these changes meet your needs. As always, I welcome your comments and suggestions.
Thanks for your continued support.
Adena
Wisconsin Land Information Association Conference
Last week the Wisconsin Land Information Association (WLIA) held its annual conference. I didn't get an official count of past conferences, but this appears to be the fifteenth edition or so. WLIA is a grassroots organization drawing in GIS and related technology users from across the state. I met municipal, county, and state staff, surveyors, assessors, university faculty, students, and private sector people. Attendance was about 460. The first day included half-day, fee-based workshops covering topics such as an introduction to a free template for comprehensive planning, the use of an online application for ecological data, resources available for comprehensive planning, and GIS certification. There were also two free sessions (GIS in the classroom and an intro to GIS for local government officials). The workshops I attended focused on "Delivering GIS Functionality and Geospatial Data with Map Services" and "LiDAR Acquisition and Data Applications."
The Map Services session highlighted the importance of portals and provided a detailed demonstration of Geodata.gov, as well as a demonstration of ArcIMS and how to upload its metadata to a portal. I'm a bit concerned that paid workshops may turn into software demonstrations. At this point in the development of Web services, I'll suggest that workshops might focus on what services are and how to use them. That might be paired with the arguments for making content and capabilities available via such services, along with related issues such as privacy, access, and payment.
The LiDAR session was very technology-focused and prompted lots of questions, some of which the presenters were quick to say they could not answer, but would explore and get back to us. I was reminded that LiDAR at one level is "just another type of remote sensing" but like any other, the better a potential user understands how it works, the better the user can judge if it's the right tool for the job. Some "neat facts" I picked up:
- LiDAR can be flown at night since it provides its own "light" via a laser. That's in contrast to traditional photography, which depends on sunlight.
- LiDAR can't "see through" clouds, so weather is an issue.
- LiDAR picks up two pieces of information for each signal sent out: (1) how far the feature the pulse hits is from the plane, which is used to determine the feature's elevation, and (2) the reflectivity of the feature (how much of the signal the feature returns), which can be used to identify some features. Water, for example, absorbs nearly all of the energy and will appear black. (A rough sketch of both measurements follows below.)
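To make those two measurements concrete, here is a minimal sketch of how a single return might be interpreted: range from the round-trip travel time of the pulse, elevation from the sensor's altitude minus that range, and a crude surface guess from the returned intensity. It is not tied to any vendor's format, and the intensity threshold is an illustrative assumption, not a published figure.

```python
# A minimal sketch of interpreting one LiDAR return.
# The 0.1 intensity threshold is illustrative, not a vendor specification.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def interpret_return(travel_time_s, aircraft_altitude_m, intensity):
    """Derive range, elevation, and a rough surface guess from one pulse.

    travel_time_s       -- round-trip time between pulse emission and return
    aircraft_altitude_m -- GPS/IMU-derived altitude of the sensor
    intensity           -- fraction of the emitted energy reflected back (0..1)
    """
    # The pulse travels to the target and back, so divide the path by two.
    range_m = SPEED_OF_LIGHT * travel_time_s / 2.0

    # For a pulse fired straight down, elevation is altitude minus range.
    # (Real systems also correct for scan angle, aircraft attitude, etc.)
    elevation_m = aircraft_altitude_m - range_m

    # Water absorbs nearly all of the energy, so very low intensity often
    # indicates water; this cutoff is a rough illustration only.
    surface_guess = "water (low reflectivity)" if intensity < 0.1 else "land/other"

    return range_m, elevation_m, surface_guess

# Example: a return received 4 microseconds after emission from 1,200 m altitude.
print(interpret_return(4e-6, 1200.0, 0.05))
```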
Over lunch, between the morning and afternoon workshops, I spoke about the problem of "too much information" in the geospatial marketplace.
The first full day of sessions began with a keynote by William Holland and Peter Thum of GeoAnalytics. I found the discussion of the organizational and management issues in the enterprise didn't tie into the rest of the conference. Perhaps that simply reflected the session choices I made during the conference.
A session by a nationally known content management company titled "Enterprise-wide Content Management/Imaging & GIS Solutions" focused on Legato Systems and ESRI software. I was pleased to find a packed room in attendance for Jason Nyberg's 20 minute (!) "Introduction to Digital Aerial Camera Systems and Uses." He made great use of time by cutting right to the chase about the differences between small/medium and large format offerings, and digital vs. film workflows. He told me later that sometimes he offers this session as a three-hour workshop, but I for one thought this "bottom line" quick overview was very effective. I'd encourage conference organizers to consider this type of focused technology overview for future conferences.
A final morning session titled "New Tools for Web GIS Development" was a demonstration of Taylor Technologies' Rapid Integration Toolkit (RIT). I was encouraged that the session began with an overview of Web mapping offerings used in Wisconsin, including MapServer, which was described as "free software" instead of the more correct "open source." The big news from Taylor Technologies is that the RIT, which is essentially a "no programming required" development environment for MapGuide, will in time be available for ArcIMS.
The lunch and town meeting helped me understand the role of WLIA in the geospatial work of the state. The group tries to solve important issues, including, for example, the fact that the 60 or so county coordinate systems aren't supported by some GIS software products. Task forces take on other key issues as well, and submit "standards" for the state.
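As an aside for readers wondering how an unsupported coordinate system is handled in practice: most software that accepts user-defined systems just needs the projection parameters. The sketch below uses the open source pyproj library, and every numeric value in it is a made-up placeholder for illustration, not the definition of any actual Wisconsin county system.

```python
# A sketch of defining a custom (unsupported) coordinate system by its
# parameters using the open source pyproj library. All numeric values
# below are placeholders for illustration, NOT a real county definition.
from pyproj import Transformer

illustrative_county_crs = (
    "+proj=tmerc +lat_0=43.0 +lon_0=-89.4 +k=1.0000 "
    "+x_0=250000 +y_0=150000 +ellps=GRS80 +units=m +no_defs"
)

# Transform a longitude/latitude pair into the custom system.
transformer = Transformer.from_crs(
    "EPSG:4326", illustrative_county_crs, always_xy=True
)
easting, northing = transformer.transform(-89.5, 43.1)
print(round(easting, 2), round(northing, 2))
```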
In the afternoon I attended a session on a county's move from MGE to ArcGIS which focused on the trials and tribulations of moving to the geodatabase. This clearly happens quite a lot, and even those moving from older ESRI systems will face some of the same issues raised by Waukesha County's discussion. I think attendees took home the important message that the move is not simple, but it is possible and worth the effort. Hearing the story from those who actually went through it was refreshing after hearing a few too many vendors in the morning.
I was specifically invited to a session on the Wisconsin Land Information Systems (WLIS, pronounced "willis") Pilot Project. This group presentation spoke to the vision and history of a proposed statewide GIS portal. The Wisconsin Department of Administration (DOA), in 2002, requested that the Department of Natural Resources help implement a prototype of WLIS. DOA then awarded grants to two counties to begin implementations. The session highlighted the two counties' experiences and looked ahead to what was possible. The strongest illustration of the importance of such a resource was delivered in a "before vs. after" story. The scenario actually occurred: a farmer was spreading manure in inappropriate locations, causing water contamination. The county GIS technician explained how he'd tackled the problem using ArcView, gathering data from the areas involved. Then, the same problem was tackled with the prototype portal.
For now, the prototype uses only ESRI software for the portal and those serving data, but the next pilot projects hope to address interoperability issues and include other technologies. The prototype team in Wisconsin, and other groups I've met, noted that funding for such work can be hard to come by.
I want to again recommend local, smaller conferences to anyone looking for networking and learning opportunities. The halls were full of discussion, all on topic. I literally heard no discussions of the Kalahari's indoor water park, which was connected to the conference venue. I do want to raise one concern. My sense is that these smaller events may depend more on vendors for presentations and workshops than larger ones. I encourage attendees (and potential attendees) to speak their minds on the role they want vendors to play in such events.
DeLorme Products for Professionals
Over the past few years DeLorme has begun offering mapping and GIS products outside the consumer realm. I recall the release of XMap, which I fear I confused with MapX and presumed was a developer tool. I was wrong; XMap family members are all end user products. (DeLorme does indeed offer developer solutions, too.) I also recall making a mental note when the company bought a LiDAR system and a digital sensor for its TopoBird data collection platform. (At right, a TopoBird shot of DeLorme Headquarters from April of last year at 15 cm resolution.) A few weeks ago the company released XMap/GIS Editor, its first full-function GIS. But only this week, when the company announced a GPS post-processing solution, did the bell go off in my head that this company was ready to serve the professional geospatial community.
So, I sort of hung my head in shame and called DeLorme to find out what I'd missed. Geoffrey Ives, Director of Professional Sales, explained that the XMap family was launched in 2000 and in the early years focused its support primarily on data available from DeLorme (a USA street database, quads, etc.). Users could bring in scanned files, geocode, and import shape files, but the idea was that most data was purchased or created via geocoding or GPS. XMap products include a tab, NetLink, that allows direct download of data from the DeLorme library. Each successive release added more data type support. The recently released XMap/GIS Editor is aimed squarely at the GIS professional. XMap/GIS Editor supports an "organization's own" data, as Ives put it. By that he means the product imports and can edit shape files, DWG, E00, GML 2.0, etc. Moreover, XMap/GIS Editor supports thematic mapping, queries, and other "high end" GIS functions. The price point: under $1,000. There's a table comparing XMap and XMap/GIS Editor functionality here. (That's a bit of interface, at left.)
This week's announcement of GPS PostPro 2.0 upped the ante again. For a few hundred dollars, DeLorme provides post-processing to get data down to sub-meter accuracy. (XMap owners and new purchasers get a discount on GPS PostPro.) The package includes an Earthmate GPS to use as a base station and the ability to tap into CORS or other online data feeds for correction. Typical cost for such a package from a traditional GPS vendor: about $3,500. While sub-meter data may not be appropriate for serious surveying, it's quite appropriate for assigning asset locations for GASB 34, rough utility measurements, and planning purposes.
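For readers unfamiliar with how post-processing squeezes out the extra accuracy, the basic idea is differential correction: a base station at a known location observes the same satellites as the rover, the difference between the base's known and computed positions estimates the error at each epoch, and that error is removed from the rover's data after the fact. The sketch below shows only the position-domain version of the idea with made-up numbers; real packages such as GPS PostPro work on the raw observations and are considerably more sophisticated.

```python
# A toy illustration of position-domain differential correction.
# Real post-processing (GPS PostPro, CORS-based workflows) operates on raw
# pseudorange/carrier observations; this only conveys the basic idea.

# Surveyed (known) position of the base station, in meters (local grid).
BASE_TRUE = (500000.00, 4800000.00)

# Positions the base station *computed* from GPS at each epoch (with error),
# and the rover positions logged in the field at the same epochs.
base_computed = [(500002.10, 4799998.60), (500001.80, 4799998.90)]
rover_logged  = [(500115.40, 4800046.20), (500115.10, 4800046.50)]

def correct(rover, base_calc, base_true):
    """Subtract the base station's apparent error from the rover fix."""
    err_x = base_calc[0] - base_true[0]
    err_y = base_calc[1] - base_true[1]
    return (rover[0] - err_x, rover[1] - err_y)

corrected = [correct(r, b, BASE_TRUE) for r, b in zip(rover_logged, base_computed)]
for point in corrected:
    print(f"{point[0]:.2f}, {point[1]:.2f}")
```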
And, DeLorme has not forgotten the Web. XMap Web serves out maps authored in several of the XMap products. An XMap Web 3.0 trial is included with the XMap/GIS, which provides a publish to Web "button." XMap Web 3.0's pricing is based on an annual fee plus the cost of XMap/GIS, and is OGC Web Map Service compliant, another nod to the professional user.
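For those who haven't worked with the OGC Web Map Service specification, compliance means a client can request map images with a standard set of URL parameters, regardless of which vendor's server sits behind them. The sketch below builds a generic WMS 1.1.1 GetMap request; the host and layer name are placeholders, not an actual XMap Web endpoint.

```python
# A sketch of a generic OGC WMS 1.1.1 GetMap request. The host and layer
# name are placeholders, not an actual XMap Web service.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "parcels",              # placeholder layer name
    "STYLES": "",
    "SRS": "EPSG:4326",               # WMS 1.1.1 uses SRS; 1.3.0 uses CRS
    "BBOX": "-90.6,42.8,-90.2,43.1",  # minx,miny,maxx,maxy
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

url = "http://example.com/wms?" + urlencode(params)
print(url)
# Any WMS-compliant server -- XMap Web, MapServer, or another vendor's
# product -- should answer an equivalent request with a map image.
```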
So how did DeLorme get into a position to offer professional products? Ives explained that the company's GPS technology has been behind three generations of receivers. At the same time, the company has been offering data sets and evolving its expertise there. Says Ives, "We believe that by making GPS technology more available and affordable for professional customers we will gain a loyal following of DeLorme professional users, who stand by our products just like our consumer users do." (That's some public domain data, drawn in XMap/GIS Editor at right.)
DeLorme is a rather new player in the professional space. Still, it has created a compelling set of products at affordable prices. The company's success in the consumer space may well help it craft the right products for the professional space. I can't help but be reminded of Thales taking advantage of its consumer (Magellan) roots and weaving that expertise into its professional products. Look for something similar here, just at a different price point.
The Deep Web and GIS
Salon had an interesting article this week about Yahoo's recent decision to start a paid inclusion program and a related topic. (You may have to sit through some commercials to read the whole article; that's how Salon writers get paid.) The idea behind paid inclusion is that by paying, one ensures one's results come up when users search on "GIS," for example. This is not what Google does with its right-side ads, which are labeled as ads. In the Yahoo model, the paid listings blend in with the unpaid ones. But, says a Yahoo rep, "If you pay, it has no input to where you're ranked." Ok, right, sure. Tell me again why I'm paying?
But Salon goes beyond that discussion and highlights another trend worth watching, one referred to as unearthing the "deep Web," that part hidden in databases and typically beyond regular search engines' reach. I have some examples from our field. Ever notice that while the text of some print articles in our field is online, there's no way to find it via a search engine? Why? It lives in a database, one not accessible via Google or Yahoo. As a second example, consider that most people I speak to use Google as a primary means to find GIS data. Why? It can find things not listed in Federal Geographic Data Committee Clearinghouses or GeoData.gov or the Geography Network. Recall, too, that data in those systems must be entered (at least once) by a human and hence the number of accessible documents is limited. Google sends out "bots" and has access to far more documents. On the other hand, Google doesn't know how to search the clearinghouses!
Now, there's good news and bad news here. The good news is that these "deep resources" typically have much better, much more structured metadata than the rest of the Web's documents. That means, hopefully, that once we have access to them, it'll be easier to find exactly what we want. More good news is that such databases can be opened up to search engines via application programming interfaces or Web services. The bad news is that for now search engines are not smart enough to tap into these treasures without "help." Still, there's hope: Google can sort images (if only by name, not content) and search PDF files. New players like BrightPlanet and Dipsie are finding ways to get into the deep Web.
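To illustrate what "opening up" a metadata database might look like, here is a minimal sketch of a service that exposes each record at its own stable URL so an ordinary crawler can reach it. The records and paths are invented for illustration; this is not how any particular clearinghouse or search engine actually works.

```python
# A minimal sketch of exposing database-bound metadata records at stable,
# crawlable URLs. The records and paths are invented for illustration only.
from http.server import BaseHTTPRequestHandler, HTTPServer

RECORDS = {
    "roads-2003": "Title: County road centerlines (2003). Theme: transportation.",
    "hydro-2002": "Title: Surface water features (2002). Theme: hydrography.",
}

class MetadataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # /records/ lists every record; /records/<id> returns one of them.
        if self.path == "/records/":
            body = "\n".join(f"/records/{key}" for key in RECORDS)
        elif self.path.startswith("/records/"):
            body = RECORDS.get(self.path.split("/")[-1], "Not found")
        else:
            body = "See /records/"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    # A crawler that discovers /records/ can follow the listed URLs and
    # index each record -- something a database-only interface would hide.
    HTTPServer(("localhost", 8000), MetadataHandler).serve_forever()
```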
I for one am hopeful that all the standard, effective spatial metadata we've worked so hard to create will, in time, be accessible via many paths.