Awesome news from the Google Geo Developers Blog. Can’t wait to see the spring, summer, autumn and winter maps that will be created. And I even know someone who always wanted Heavy Metal maps – no need to wait for that any longer. Dive into the Google Maps API V3 and make it happen.
The latest post on the Google Lat Long Blog presents a neat new way to visualise big piles of data on a map. Pretty cool that they chose an example with bike trails to explain what can be done with Google Fusion Tables. The MTBGuru blog has more details along with maps and screenshots on their site. I think we will give this a try at work and see if it may be an alternative to the more traditional clustering approach that we currently use for visualising many thousands of trails on a Google Map.
One problem I see with the approach of simply drawing all the tracks on a map is that there are many sections where tracks overlap. Most of the tracks are GPS tracks that are not snapped to a network, so you end up with tracks on top of tracks on top of tracks. A spaghetti trail map. If the trails were modelled as a network, though, it would be cool if you could just click on a part of the track network and have suggested tours come up that pass through the section you just clicked on.
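As a rough illustration of that click-to-suggest idea, here is a minimal sketch assuming the trail network has already been split into segments with IDs and each tour is stored as the list of segment IDs it traverses. All names and the data model are hypothetical, not part of any real API:

```python
# Sketch: inverted index from network segments to the tours
# that pass through them, so a click on a segment can return
# a list of suggested tours. Segment IDs and tour names below
# are illustrative only.

from collections import defaultdict

def build_segment_index(tours):
    """Map each segment ID to the set of tour names passing through it."""
    index = defaultdict(set)
    for name, segment_ids in tours.items():
        for seg in segment_ids:
            index[seg].add(name)
    return index

# Two hypothetical tours sharing segment "s2":
tours = {
    "ridge loop": ["s1", "s2", "s3"],
    "valley out-and-back": ["s2", "s4"],
}
index = build_segment_index(tours)

# Clicking segment "s2" would suggest both tours:
print(sorted(index["s2"]))  # prints ['ridge loop', 'valley out-and-back']
```

The hard part in practice would of course be the map-matching step that snaps raw GPS tracks onto the shared network in the first place; the lookup itself is trivial once that exists.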
Love My Maps? I love “your maps” 😉 but besides the excitement of announcements, such as the one about adding editing functionality to the Google Maps API, which was made on the Official Google Maps API Blog this morning, some frustration also comes along with it. Why? How many developers do you think are solving the same problems again and again? We have come to know Google as a company that does not like to let people know too much about what’s next. Fair enough, you may say – maybe you have to stick to these rules to survive in a market that is increasingly showing signs of disruption, and giving rise to speculation about turning from a blue ocean into a red ocean. I guess Google really has transformed into a supertanker in the last few years.
By the way, the addition of editing functionality to the Google Maps API was just one of several “new Google Maps things” in the last week. We now have a Google Maps API for Flash, new user functions and content to explore and discover the world around us and far away 😉, and the new real estate options.
So why is there also frustration besides the excitement? While some people certainly like to “prototype-(p)re-invent” wheels, others specifically DO NOT. Especially for smaller businesses that use Google Maps or Google Maps Enterprise as a tool for making some kind of business idea work, it can be really frustrating to spend time and resources on developments that can often literally go in the bin a few weeks later – when the exact announcement you had been waiting for is finally made.
For those who just intend to use Google Maps as a platform to realise their new mashup idea, planning ahead is extremely difficult. Do you want to spend a few weeks or months developing some fancy new feature that is not available on Google Maps yet, or not exposed in the API just yet – only to find exactly that feature being added after you have engineered it yourself? If you are still working on it, you may save yourself some night shifts, and that might bring at least some excitement alongside the frustration of having wasted a lot of valuable time inventing a wheel that is now given away to anyone for free! Or go and risk it – you may become famous among the visionaries and technology enthusiasts for a few days, and if you are quick and successful in crossing the chasm, you may become even more famous and rich – or you may end up at the bottom of the crevasse, with all the others that vanish in the awesomeness icefall.
I’ll give you an example: Google released a Static Maps API earlier this year. Then, a little later, Google added support for more marker options and paths on static maps, along with the note that lines may only contain 50 vertices. Now what if you need a hundred? Draw two lines? Should be an OK hack. If you need 1,000 – hmm, how about building something yourself? A sophisticated server-side solution? There are enough tools out there to do something like that… Or wait until Google decides that lines can be longer? You decide! Good luck – and be sure not to be frustrated, but excited instead, when things keep changing… you know!
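The “draw two lines” hack generalises in an obvious way. Here is a minimal sketch that splits a long path into chunks of at most 50 points, repeating the last point of each chunk at the start of the next so the drawn segments join up. The 50-vertex limit comes from the announcement above; the `path=` parameter format is an assumption for illustration, so check the actual Static Maps documentation before relying on it:

```python
# Sketch: split a long GPS track into chunks that each stay
# under the static maps per-line vertex limit, and build one
# hypothetical 'path' query parameter per chunk.

MAX_VERTICES = 50

def split_path(points, max_vertices=MAX_VERTICES):
    """Split a list of (lat, lng) tuples into overlapping chunks.

    Consecutive chunks share one point so the separately drawn
    lines connect seamlessly on the map.
    """
    if len(points) <= max_vertices:
        return [points]
    chunks = []
    start = 0
    while start < len(points) - 1:
        end = min(start + max_vertices, len(points))
        chunks.append(points[start:end])
        start = end - 1  # repeat the last point as the next chunk's first
    return chunks

def path_params(points):
    """One 'path=' parameter per chunk (illustrative format only)."""
    return ["path=" + "|".join(f"{lat:.5f},{lng:.5f}" for lat, lng in chunk)
            for chunk in split_path(points)]
```

For a track with many thousands of points you would probably want to simplify the line (e.g. Douglas–Peucker) before chunking, otherwise the URL length becomes the next limit you hit.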
If you want a bit more predictability, and you can afford to do without a substantial set of free geodata, then you may answer the question “must it always be Google Maps?” with a clear “no”! For example, follow the OpenLayers Blog and you will have a pretty good idea of what is coming next – in fact, you can even contribute to make it happen faster!
Excitement and frustration often sit very close together, and timing may decide whether you smile or get annoyed while digesting your daily news. I’ll start my day by going out for a run now – awesome nature is waiting for me!
The Google Lat Long Blog posted some exciting news today. People can now edit locations on Google Maps and, for example, adjust the position of a marker to provide more accurate positioning information. Some of the interesting questions that arise in the context of this new feature concern Google’s backend geocoding database. Google needs reference geo-datasets to do its geocoding. Let’s assume that almost all of the marker positions are at least slightly off – and that heaps of people will love to help out and correct them. When Google serves geocoding requests, it will use the updated information to provide more accurate results for the user. Some of the businesses that sell geocoding reference datasets charge customers by the number of geocoding requests against the reference database. Does Google have to count when someone searches for an address that has been manually corrected? Does this need to be logged as a normal geocoding request? Will this “enhanced”, crowdsourced geocoding reference information flow back to the original data providers? Who owns the coordinates? Reliability issues aside, isn’t that a great new dataset in the making? Or am I missing that the providers of geocoding reference databases still hold street name information and the like, which kind of “belongs” to them and is still needed for geocoding? Any chance of Google freeing itself more and more from huge geocoding bills?
More details about the new feature can also be found in a post on the Understanding Google Maps and Yahoo Local Search Blog.