Comment by tavioto

4 days ago

Good post. However, what most people fail to see is that the quality of a geocoder's results depends heavily on the quality of the input data.

Google is good at normalizing the input address to identify and correct it, but if the input contains too much extra information, such as directions for finding the address, street-name variations, or phone numbers, it gets confused and the results will not be good.

I deal with lots of very poorly formatted addresses for last-mile and delivery companies, and they have horrible data, so I created AddressHub. It focuses mostly on address normalization, but it also offers a router that connects to multiple geocoders and analyzes their results to prevent false positives (results reported as high accuracy that are actually miles away from the real location).
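The cross-checking idea can be sketched roughly like this (a hypothetical illustration, not AddressHub's actual code): geocode the same address with several providers, measure the pairwise great-circle distance between the candidate coordinates, and flag the lookup as suspect when the providers disagree by more than some threshold.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # mean Earth radius ~6371 km

def is_suspect(candidates, max_spread_km=1.0):
    """True if any two geocoder results are further apart than the threshold.

    `candidates` is a list of (lat, lon) tuples, one per geocoder;
    the 1 km default threshold is an arbitrary choice for illustration.
    """
    return any(
        haversine_km(candidates[i], candidates[j]) > max_spread_km
        for i in range(len(candidates))
        for j in range(i + 1, len(candidates))
    )
```

In practice you would also weigh each provider's own accuracy score, but distance disagreement alone already catches the worst "high confidence, wrong city" cases.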

It also provides caching in accordance with each geocoder's terms and conditions, and it first checks open data to see whether the address can be geocoded there (at no cost), then falls back to the commercial geocoders to find the best result.
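The open-data-first fallback with caching could look something like this minimal sketch (all the provider functions here are stand-ins, not AddressHub's real API):

```python
def make_router(open_data, paid_geocoders, cache=None):
    """Build a geocode function that tries a free open-data source first,
    then falls through a list of paid providers, caching accepted answers.

    Each provider is a callable taking an address string and returning a
    (lat, lon) tuple, or None when it cannot resolve the address.
    """
    cache = {} if cache is None else cache

    def geocode(address):
        if address in cache:
            return cache[address]          # hit: no provider is called
        result = open_data(address)        # free source first
        if result is None:
            for provider in paid_geocoders:
                result = provider(address) # fallback chain, in order
                if result is not None:
                    break
        if result is not None:
            cache[address] = result        # only cache real answers
        return result

    return geocode
```

A real router would also rank the fallback providers by cost and historical accuracy rather than a fixed order.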

Take a look at it and let me know what I should improve: www.address-hub.com