Our IP geolocation process is quite complicated, and we have a team of data engineers, infrastructure engineers, and data scientists working on various aspects of it. Our approach, therefore, is that users can ask us questions and we will try our best to answer them.
Would you consider offering no-signup inspection of the data you hold on the requester's IP address? I would love to see what you have on MY IP address, and if it's sufficiently accurate, that would be a good incentive to sign up and use it commercially.
It feels like it couldn't be abused by 'freeloaders', because I'd guess their use case is viewing other people's.
We have a very open approach to our data. In fact, our website is extremely accessible: it is quite useful for researching IP addresses and does not require signing up. The data is largely available to view on the website. Although we display all IP address metadata on the home page, if you intend to use our website frequently, I recommend the IP data pages.
Thousands of people live in a zip code, while hundreds of thousands of people live in a city. We are literally giving away that data for free through our API and database. The creepiness of IP geolocation is mostly a meme.
IP geolocation is mainly used in cybersecurity and marketing analytics. There are many ways to geolocate someone. I once came across a project that could estimate the country a user is from based on their writing style and grammar mistakes. For example, American people sometimes use "should of" instead of "should have". Knowing the geolocation of an IP address isn't super creepy. It's just how things work on the internet.
You might want to unplug your router then. A conceit of being connected to a network is you're connected to the network. If you can see other nodes they can see you.
It is just ping data. We ping an IP address, get the RTT, draw a radius on the globe, and say that the IP could be anywhere inside that radius. Then we do another ping and draw another radius, and the IP address must lie near the intersection of the two circles. Now, if we do it enough times, we can get an estimate of where the IP address is located.
The data is not derived from the IP address itself, but rather from the measurement process. And it's just a ping. Moreover, the majority of IP addresses are not pingable, so we rely on other in-house statistical and scientific models to estimate the location. The probe infrastructure is extremely complicated and there are billions of IP addresses, which is why we do not have a robust range filter mechanism.
But I encountered two things using ipinfo: Hetzner servers that are in Germany in a fixed location and have never moved are sometimes located in another country. For me it was once a server placed in Moscow and once one in South America.
When a company buys an IP address block or relocates an IP block from one of its data centers to another, the location of those IP addresses changes.
If your IP address is static, but we have made an error in geolocation, I would love to take a closer look. You can email our support (support@ipinfo.io) and send a link to the comment. We can discuss it further from there.
We don't have any free data for that. We have historical data that we sell as part of our custom enterprise deals. Historical data requests are rare, though.
A time-series IP database requires a substantial amount of storage and, I imagine, computational cost to query. The city-level geolocation data we have is ~1.5 GB in size. IP range data is complicated to query efficiently, as you need to understand data platform settings plus a good amount of computer networking math and computer science. Adding a layer of time-series complexity on top of that makes the process quite difficult.
To give you some context of how IP metadata lookups work, you can check out this article
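To illustrate the "network math" part of range lookups: a common approach is to convert each IP to an integer and binary-search a table of sorted ranges. A minimal sketch with toy data (this is the general technique, not IPinfo's actual schema or storage):

```python
import bisect
import ipaddress

# Store ranges as (start_int, end_int, value), sorted by start, then
# binary-search the integer form of the query IP. Real databases hold
# millions of rows and use tries/MMDB files, but the math is the same.
ranges = sorted([
    (int(ipaddress.ip_address("1.0.0.0")), int(ipaddress.ip_address("1.0.0.255")), "AU"),
    (int(ipaddress.ip_address("8.8.8.0")), int(ipaddress.ip_address("8.8.8.255")), "US"),
])
starts = [r[0] for r in ranges]

def lookup(ip):
    """Find the range containing `ip`, or None if it falls in a gap."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(starts, n) - 1
    if i >= 0 and ranges[i][0] <= n <= ranges[i][1]:
        return ranges[i][2]
    return None

print(lookup("8.8.8.8"))   # US
print(lookup("9.9.9.9"))   # None (not in any toy range)
```

A time-series version multiplies every row by its validity window, which is where the storage and query cost the comment mentions comes from.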
Our VPN recognition is behavior-based. So, there is probably a chance that the IP address you are using is showing some of those behavior patterns.
A behavior pattern could be that your IP address is being shuffled around random locations that go beyond the normal location shuffling of an ISP connection.
Also, if your IP range is listed in some public datasets that belong to a VPN service, we could recognize your IP as a VPN.
Please reach out to our support and let us know about this. Thanks
If you don't want to do this yourself, you can actually just get Cloudflare to do it for you for free using a simple Worker since all Cloudflare requests contain approximate IP location information.
You can also just send a request to my URL (Cloudflare Worker operated - so it should have global low latency): https://www.edenmaps.net/iplocation
Use it for small applications, I don't mind. Just don't start sending me 10M requests per day ;-)
The result is [lon, lat]. You've most likely pasted it into Google Maps, which works with [lat, lon]. Believe it or not, the industry still hasn't settled on a standard order.
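A tiny guard against that classic mix-up (the [lon, lat] ordering matches GeoJSON; map UIs generally expect "lat,lon"):

```python
# Swap a GeoJSON-style [lon, lat] pair into the (lat, lon) order that
# Google Maps and most map UIs expect.
def to_lat_lon(pair):
    lon, lat = pair
    return (lat, lon)

print(to_lat_lon([13.40, 52.52]))  # (52.52, 13.4) -> paste as "52.52,13.4"
```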
As accurate as MaxMind[1], since that's what they use [2]. In my experience, it's reasonably accurate for the US, less so for other countries. MaxMind publishes some accuracy data which might be an interesting starting point [3]
That said, for any analytics use of this data, be aware that MaxMind will group a lot of what should be unknowns in the middle of a country. Or, in the case of the US, I think they now all end up in the middle of some lake, since some farm owners in Butler County, Kansas got tired of cops showing up and sued MaxMind. It can cause odd artifacts unless you filter those addresses out somehow.
I work for IPinfo and we do ping based geolocation. The best thing you can do to verify geolocation accuracy is the following:
- Download a few free IP databases
- Generate a random list of IP addresses
- Do the IP address lookups across all those databases
- Identify the IP address that can be pinged
- Visit a site that can ping an IP address from multiple servers
- Sort the results by lowest avg ping time
Then check where the geolocation provider is locating the IP address and what is the nearest server from there.
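The final check above can be sketched in code: given each provider's claimed coordinates for one test IP, plus average RTTs measured from probes at known locations (all numbers below are invented for illustration), measure how far each claim sits from the lowest-latency probe:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# One test IP: two providers claim Paris vs Madrid (illustrative values).
claims = {"provider_a": (48.85, 2.35), "provider_b": (40.4, -3.7)}
probe_rtts = [
    {"name": "paris",  "lat": 48.85, "lon": 2.35,  "avg_rtt_ms": 2.1},
    {"name": "madrid", "lat": 40.4,  "lon": -3.7,  "avg_rtt_ms": 21.0},
    {"name": "nyc",    "lat": 40.7,  "lon": -74.0, "avg_rtt_ms": 82.0},
]

def nearest_probe(probes):
    """The probe with the lowest average RTT is (probably) nearest the target."""
    return min(probes, key=lambda p: p["avg_rtt_ms"])

def score(claims, probes):
    """Distance (km) from each provider's claim to the lowest-RTT probe; smaller is more plausible."""
    best = nearest_probe(probes)
    return {
        name: round(haversine_km(lat, lon, best["lat"], best["lon"]))
        for name, (lat, lon) in claims.items()
    }

print(score(claims, probe_rtts))  # provider_a is consistent (~0 km); provider_b is ~1000 km off
```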
export function onRequest(context) {
  // Cloudflare attaches approximate geolocation to every request via request.cf.
  const { longitude, latitude } = context.request.cf;
  return new Response(JSON.stringify([parseFloat(longitude), parseFloat(latitude)]), {
    headers: { "Content-Type": "application/json;charset=UTF-8" },
  });
}
This is a function on Cloudflare Pages (which is just a different name for Cloudflare Workers). Minor adjustment needed for Workers (get rid of "context", I believe)
As someone who lives in a country where the national language is not my first language, I hate websites that use IP location to make a lazy assumption about my choice of language and force it on me, even though my browser is quite clearly sending language headers that get ignored.
I work for an IP geolocation service, and even I hate this thing.
They call it "web experience personalization" in the industry, and it is annoying. I have never recommended anyone to do that. The best way to do website personalization through IP geolocation:
- Taxes and stuff (if applicable)
- Delivery costs (if applicable)
- Putting the user's country first in those country selection drop-down menu
And that's about it off the top of my head. In my experience, these translations never work and only create distractions. Regardless of the website's positive intentions, using Google Translate to create a native-language version of the site is just not a good idea.
A fair number of popular internationalization frameworks also drive the idea that region and language are fixed pairs.
The example I often use to illustrate this problem is that there are roughly 4 million Norwegian speakers in the world, but 14 million speakers of Catalan. Visit an international website in Spain and you rarely get given the option to have it in Catalan.
I live in the US, and IP geolocation points to the incorrect regions (plural) on all my devices.
Few technologies manage to make my day-to-day internet experience worse than these sorts of databases.
I wish they would just go away.
Websites could just ask me my zipcode on first load instead of guessing it wrong every single time and then burying the flow to fix it behind multiple links and page loads.
Also: There is no way to fix the database to produce the “correct” or “better” answer. I rarely want a website to use my current location.
Instead, I check inventory for stores in places where I will be. This whole space is trying to solve an ill-posed problem.
It all depends on what you want to use it for and how accurate it needs to be.
The best way to build a geolocation service is to have a billion devices that report their location to you at the same time they report their IP to you. That's basically Apple and Google. They have by far the best geolocation databases in the world, because they get constant updates of IP and location.
The trick is basically to make an app where people willingly give you their location, and then get a lot of people to use it. That's the best way to build an accurate geo-location database, and why every app in the world now asks for your location.
4-square had the right idea, they were just ahead of their time.
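The aggregation idea above can be sketched in a few lines: apps report (ip, lat, lon) tuples, and you take a per-/24 median so one traveller's stale report doesn't drag the estimate (all data below is invented):

```python
import ipaddress
from collections import defaultdict
from statistics import median

def subnet24(ip):
    """Collapse an IPv4 address to its /24 network, the usual aggregation bucket."""
    return str(ipaddress.ip_network(ip + "/24", strict=False))

def build_db(reports):
    """Aggregate (ip, lat, lon) device reports into a per-/24 median location."""
    buckets = defaultdict(list)
    for ip, lat, lon in reports:
        buckets[subnet24(ip)].append((lat, lon))
    return {
        net: (median(lat for lat, _ in pts), median(lon for _, lon in pts))
        for net, pts in buckets.items()
    }

reports = [
    ("203.0.113.5", 52.52, 13.40),    # Berlin
    ("203.0.113.9", 52.51, 13.41),    # Berlin
    ("203.0.113.200", 48.85, 2.35),   # one outlier report from Paris
]
db = build_db(reports)
print(db["203.0.113.0/24"])  # median lands near Berlin despite the outlier
```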
Even 10 years ago, Apple's internal privacy policies prevented it from collecting precise lat/long. We had to use HTTP session telemetry to determine which endpoints were best for a given IP (or subnet, but not ASN), which informed our own pseudo-geoIP database so we knew which endpoint to connect to based on real-world conditions.
Even still, it had to be as ephemeral as possible for the sake of privacy. We weren’t allowed to use or record results from Apple Maps’ reverse geo service outside of the context of a live user request (finding nearby restaurants, etc).
Somewhat relevant: Google Maps can learn the location of your IP based on which locations you browse in the map. If you browse a specific location enough times, it will use that as the default location when you open Google Maps, even if you clear all cookies. (I discovered this just from using Google Maps, and I'm a little concerned by the privacy implications, considering that multiple people may share an IP address.)
Google certainly uses its geolocation DB, but it also learns based on map browsing patterns.
To clarify, the scenario I described is as follows: 1. Initially, when I open Google Maps in a clean browser it defaults to my real location. 2. I repeatedly browse some other location. 3. When I open Google Maps in a clean browser, it defaults to that other location. The only reason for Google Maps to pick that other location is my map browsing.
Interesting but this isn't actually how geolocation is done, right? The ARIN/RIPE data isn't sufficiently accurate to be useful beyond country. Commercial geolocation involves correlating client IP vs known physical location e.g. from WiFi AP or mailing a package to the user. At least that's what I have been told over the decades.
I work in adtech and this is how we do geolocation. There's also device geolocation but if the user doesn't consent to sharing their GPS data with us, we just use IP address for targeting. Common provider for this is Maxmind; they ship a database that you host locally and query
Comments seem fairly dismissive, but I actually found this really interesting. It reminds me of a task I had in my first position to add PostGIS to our database and build a location-based search. That was based on addresses and zip codes.
So, at the risk of outing myself, I wrote http://www.hostip.info a long time ago* which used a community approach to get ip address location ("is this guess wrong ? Fix it please").
The last time I checked (maybe a decade ago [grin]) it worked pretty much perfectly for a country, imperfectly for a region, and better-than-a-coin-toss for city resolution. All the data is free.
I don't think they have it on the site any more, but I used to have a rotating 3D-cube thing (x,y,z were the first 3 octets of the address) for things like known-addresses, recent lookups, etc. I used different colours for different groups (country, continent,...) It was so old it was written as a Java applet. Yeah. I guess if I were to do it again, it'd be WebGL.
--
*: I sold it a long time ago, with the proviso that the data must always remain free. I actually didn't believe the offer at first (it came as an email, and looked like a scam) but it went through escrow.com just fine, and I think we both walked away happy. That was almost 2 decades ago now though.
This just links to an MMDB file that is already compiled; there isn't anything to show this is a "modern" implementation of anything if the implementation itself isn't available.
Any suggestions for geolocating datacenter IPs, even very roughly? I'm analysing traceroute data, and while I have known start and end locations, it's the bit in the middle I'm interested in.
I can infer certain details from airport codes in node hostnames, for example.
It would also be possible - I guess - to infer locations based on average RTT times, presuming a given node's not having a bad day.
Anyone have any other ideas?
Edit: A couple of troublesome example IPs are 193.142.125.129, 129.250.6.113, and 129.250.3.250. They come up in a UK traceroute - and I believe they're in London - but geolocate all over the world.
Those IPs are owned by Google and NTT, who both run large international networks and can redeploy their IPs around the world when they feel like it. So lookup based geolocation is going to be iffy, as you've seen.
Traceroute to those IPs certainly looks like the networking goes to London.
The google IP doesn't respond to ping, but the NTT/Verio ones do. I'd bet if you ping from London based hosting, you'll get single digit ms ping responses, which sets an upper bound on the distance from London. Ping from other hosting in the country and across the channel, and you can confirm the lowest ping you can get is from London hosting, and there you go. It could also be that its connectivity is through London, but it's elsewhere --- you can't really tell.
Check from other vantage points, just to make sure it's not anycast; if you ping 8.8.8.8 from most networks around the world, you'll get something nearby; but these IPs give traceroutes to london from the Seattle area, so probably not anycast (at least at the moment, things can change).
If you don't have hosting around the world, search for public looking glasses at well-connected networks that you can use for pings like this from time to time.
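The "upper bound" arithmetic in the comment above is simple: light in fiber covers roughly 200 km per millisecond, and RTT is a round trip, so half the RTT bounds the one-way distance. A quick helper (the 200 km/ms figure is a rule of thumb; real paths add routing and queuing slack, so true distances are shorter than the bound):

```python
# Upper bound on distance from probe to target, given a round-trip time.
def max_distance_km(rtt_ms, km_per_ms_in_fiber=200):
    # RTT covers the path twice, so halve it before converting to distance.
    return (rtt_ms / 2) * km_per_ms_in_fiber

print(max_distance_km(2))   # 200.0 -> a 2 ms ping means "within ~200 km"
print(max_distance_km(9))   # 900.0 -> single-digit ms keeps you on the same landmass
```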
"TULIP's purpose is to geolocate a specified target host (identified by IP name or address) using ping RTT delay measurements to the target from reference landmark hosts whose positions are well known (see map or table)."
> A couple of troublesome example IPs are 193.142.125.129, 129.250.6.113, and 129.250.3.250. They come up in a UK traceroute - and I believe they're in London - but geolocate all over the world.
If I were running a popular app/web service, I would have my own AS number, would have purchased a few blocks of IP addresses under this AS, and would advertise these addresses from multiple owned/rented datacenters around the world.
These BGP advertisements would be to my different upstream Internet service providers (ISPs) in different locations.
For a given advertisement from a particular location, if you see a regional ISP as upstream, you can make an educated guess that this particular datacenter is in that region. If these are Tier 1 ISPs who provide direct connectivity around the world, then even that guess is not possible.
If you have the ability to do traceroutes from multiple probes sprinkled across the globe with known locations, then you could triangulate by looking at the fixed IPs of the intermediate router interfaces.
Even this is defeated if I were to use a CDN like Cloudflare to advertise my IP blocks to their 200+ PoPs and ride their private network across the globe to my datacenters.
> If you have ability to do traceroute from multiple probes sprinkled across the globe with known locations
Everyone who's aware of RIPE Atlas has that ability.
I have almost a billion RIPE Atlas credits. A single traceroute costs 60. I have enough credits to run several traceroutes on the entire IPv4 internet. (the smallest possible BGP announcement is /24, so max of 2^24 traceroutes, but in reality it's even less).
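The credit arithmetic checks out roughly like this (60 credits per traceroute is the figure quoted above; a full 2^24 sweep would eat about a billion credits, so multiple sweeps only fit because far fewer /24s are actually announced):

```python
# RIPE Atlas credit math: one target per /24, the smallest BGP announcement.
cost_per_traceroute = 60
max_targets = 2 ** 24                       # 16,777,216 possible /24 prefixes
full_sweep = cost_per_traceroute * max_targets
print(full_sweep)  # 1006632960 -> one exhaustive sweep is about a billion credits
```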
I have to scrape the whole IP address space since I offer location information as part of my API.
Also, I only need to scrape as many WHOIS records as there are distinct networks out there. For example, for the IPv4 address space, there are far fewer networks than there are IPv4 addresses (2^32).
Also, most RIRs provide their WHOIS databases for download.
Therefore, "scraping" is not really the correct word; it's a hybrid approach, but mostly based on publicly available data from the five RIRs.
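For reference, the RIRs also publish "delegated statistics" files in a documented pipe-separated format. A minimal parser for the IPv4 record shape (the sample line below follows that format but is used here purely for illustration):

```python
import ipaddress

# Delegated-stats rows look like:
#   registry|cc|type|start|value|date|status
# For ipv4 rows, "value" is the number of addresses in the block.
def parse_delegated_line(line):
    registry, cc, rectype, start, value, *_ = line.split("|")
    if rectype != "ipv4":
        return None  # skip asn/ipv6 rows in this sketch
    first = ipaddress.ip_address(start)
    last = ipaddress.ip_address(int(first) + int(value) - 1)
    return {"registry": registry, "country": cc, "first": str(first), "last": str(last)}

rec = parse_delegated_line("ripencc|FR|ipv4|2.0.0.0|1048576|20100712|allocated")
print(rec["first"], "-", rec["last"], rec["country"])  # 2.0.0.0 - 2.15.255.255 FR
```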
Maybe a dumb question (I have no knowledge), but why wouldn't we think of .CSV files as databases? It can have columns and rows filled with information and isn't that what makes a thing a database?
Are we really going to do the mincing of words here? Did you need the word "dump" or "export" before you understood? Although I wasn't wild about the original poster's "step 1" terseness, it's silly to think a normal person wouldn't be able to parse the sentence well enough to understand "download the database contents - perhaps stored in CSV format".
If in your mind database implies a type of technology and not something conceptual, you’re really just outing yourself as someone that needs someone between you and the boardroom. Certainly not something to show off on Hacker News.
I think it's interesting that the one IP range I decided to check has correct information on the ipapi.is web site, but unambiguously incorrect information in the downloadable geolocationDatabaseIPv4.csv. Somehow Bedford, New Hampshire (which came straight from WHOIS) became Bedford, Texas.
I’m duplicating my comment elsewhere in this thread, so each serves as a direct reply to the different geolocation providers in this thread, in the hope that it will be recognized as a problem with data that implies that it’s more precise than it really is:
> On one hand, I love that there’s some good alternatives in the geolocation space, but misleading geolocation precision can lead to very undesirable side effects[0].
I feel like a more useful and accurate way would be to buy client ip and GPS location data in bulk from one of the mobile data brokers who have their spyware embedded in zillions of popular apps/games and then group it by /24 or something.
Fantastic stuff. I work for IPinfo.io and I actually came across your site about a month ago. I was planning to contact to you about mentioning the free IP databases [0].
However, when I saw that a few API endpoints didn't return any response, I thought maybe the site was not maintained.
I find it absurd that the geographic coordinate values return up to 15 decimal places in an IP geolocation response. IP geolocation is never that precise, and this level of "precision" is not warranted and frankly distracting. At best it should be 4 decimal places (roughly 11 meters of resolution).
I expected traceroute to play a bigger part in this. If you know the route to an IP address and the location of routers, perhaps even from a few different servers, then you should be able to locate it fairly well.
Maybe not exactly what you're looking for, but the ipfire project has a git repository[1] mapping address ranges to countries. It apparently goes back to 2017 only.
A long time ago I built a project like that, but instead of relying on WHOIS, I did a traceroute to every IPv4 address available.
Several router hops have reverse DNS names that include city codes (like airport codes).
Most providers have a single hop per city, so it's easy to correlate the last router hop to a city.
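A toy version of that rDNS trick (the hostname patterns and city-code table below are illustrative, not any real carrier's naming scheme):

```python
import re

# Many carrier router hostnames embed a city or airport code, e.g. "lon01" or "fra6".
CITY_CODES = {"lon": "London", "fra": "Frankfurt", "nyc": "New York", "lax": "Los Angeles"}

def city_from_rdns(hostname):
    """Scan hostname tokens for a known city code, ignoring trailing digits."""
    for token in re.split(r"[.\-]", hostname.lower()):
        stem = token.rstrip("0123456789")  # "lon01" -> "lon"
        if stem in CITY_CODES:
            return CITY_CODES[stem]
    return None

print(city_from_rdns("ae-5.r24.lon01.example-carrier.net"))  # London
print(city_from_rdns("ge-0-0-0.cr1.fra6.example.net"))       # Frankfurt
```

Real deployments need per-carrier pattern tables (projects like hoiho maintain these), since every network names its routers differently.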
The easiest way to get a geolocation is to ask the user. Maybe they’ll just tell you, and if that’s good enough for your application there’s no need for such solutions.
First, I was a big fan of your articles even before I joined IPinfo, where we provide an IP geolocation data service.
Our geolocation methodology expands on the methodology you described. We utilize some of the publicly available datasets that you are using. However, the core geolocation data comes from our ping-based operation.
We ping an IP address from multiple servers across the world and identify its location through a process called multilateration. Pinging an IP address from one server gives us one dimension of location information, meaning that based on certain parameters the IP address could be anywhere within a certain radius on the globe. Then, as we ping that IP from our other servers, the location information becomes more precise. After enough pings, we have very precise IP location information that almost reaches zip-code-level precision with a high degree of accuracy. Currently, we have more than 600 probe servers across the world, and the network is expanding.
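As an outsider's sketch of how multilateration constrains a location (a toy model, not IPinfo's implementation): each probe's RTT bounds the distance to the target, and any candidate location must fall inside every probe's disc. The 200 km/ms fiber figure and all coordinates/RTTs below are illustrative:

```python
from math import radians, sin, cos, asin, sqrt

# Light in fiber travels roughly 200 km per ms; RTT is a round trip, so each
# ms of RTT bounds the one-way distance by about 100 km (plus routing slack).
KM_PER_MS_OF_RTT = 100

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def rtt_radius_km(rtt_ms):
    """Upper bound on how far the target can be from the probe."""
    return rtt_ms * KM_PER_MS_OF_RTT

def candidate_is_consistent(lat, lon, probes):
    """A candidate location must lie inside every probe's RTT disc."""
    return all(
        haversine_km(lat, lon, p["lat"], p["lon"]) <= rtt_radius_km(p["rtt_ms"])
        for p in probes
    )

# Toy probes in London, Frankfurt, and New York pinging the same target.
probes = [
    {"lat": 51.5, "lon": -0.1,  "rtt_ms": 5},   # London: target within ~500 km
    {"lat": 50.1, "lon": 8.7,   "rtt_ms": 12},  # Frankfurt
    {"lat": 40.7, "lon": -74.0, "rtt_ms": 75},  # New York
]

print(candidate_is_consistent(52.37, 4.9, probes))   # Amsterdam fits all three discs
print(candidate_is_consistent(55.75, 37.6, probes))  # Moscow violates the London disc
```

More probes shrink the feasible region, which is why adding servers keeps improving precision.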
The publicly available information that you are referring to is sometimes not very reliable in providing IP location data as:
- They are often stale and not frequently updated.
- They are not precise enough to be generally useful.
- They provide location context at a large IP-range level or even at organization scale.
And last but not least, there is no verification process for these public datasets. With IPv4 trading and VPN services becoming more and more popular, we have seen evidence that in some instances inaccurate information is being injected into these datasets. We are happy and grateful to anyone who submits IP location corrections to us, but we do verify these correction submissions for that reason.
From my experience with our probe network, I can definitely say that it is far easier and cheaper to buy a server in New York than in any country in the middle of Africa. Location of an IP address greatly influences the value it can provide.
We have a free IP to Country ASN database that you can use in your project if you like.
https://ipinfo.io/developers/ip-to-country-asn-database
Big fan of what articles? On https://incolumitas.com/ or on https://ipapi.is/?
Great idea with latency triangulation. I have used latency information for a lot of things, especially VPN and proxy detection.
But I didn't expect you could obtain locations that accurate. I am honestly impressed; latency triangulation with 600 servers gives a very good approximation. Nice, man!
Some questions:
- ICMP traffic is penalised/degraded by some ISPs. How do you deal with that?
- In order to geolocate every IPv4 address, you need to constantly ping billions of IPv4 addresses. How do you do that? Do you only ping an arbitrary IP in each allocated inetnum/NetRange?
- Most IP addresses do not respond to ICMP packets; only some servers do. How do you deal with that? Do you find the router in front of the target IP and geolocate the closest router to the target IP (traceroute)?
https://incolumitas.com/
This is my all-time favorite article: https://incolumitas.com/2021/11/03/so-you-want-to-scrape-lik...
I used to do freelance web scraping, and that article felt like some kind of forbidden knowledge. After reading the article, I went down the rabbit hole and actually found a Discord server that provided carrier-grade traffic relay from a van which contained dozens of phones.
For the questions... we have to wait a bit; someone from our engineering team might come here and reply.
By the way, since I have you here, have you considered converting the CSV files to MMDB format? I was planning to do that with our mmdbctl tool later today.
https://github.com/ipinfo/mmdbctl
I'm very curious why you'd do VPN/proxy detection...
But at a previous company I worked at that ran a very large chunk of the internet, we did indexing of nearly the entire internet (even large portions of the dark web) approximately every two weeks. There were about 500 servers doing that non-stop. So, I think it is relatively reasonable if you have 600 servers to do that.
You can guess pretty well which IPs are related from BGP announcements, so you only need to probe a few IPs per announced block (or, for small blocks, per ASN). You can use that logic.
ICMP response time is not useful for "locating" an anycasted address, some of which have a logical location associated with them. See https://blog.cloudflare.com/icloud-private-relay/ for an example.
Well, at least you can detect it is an anycast address, and mark it as such.
Great comment. I'm a big fan and customer of IPinfo, using your API in our login notification emails to say "You just logged in from Berlin, Germany. If this wasn't you click here." To provide country data for customers in their audit logs. And for anti-spam and fraud detection.
I appreciate it, sir! If you have any questions or feedback, please let us know.
The challenge of being a data provider is that you can use our data in a million ways, and we don't have visibility into all of them. So, when you come up with questions or ideas, we can help you better.
As you mentioned, audit logs. I highly recommend you look into the ASN field.
The ASN identifies an organization that owns a block of IP addresses. In my experience, I have found that the combination of ASN+Country is the most valuable information you can use in spam and fraud detection. You can fake the IP geolocation information with a VPN. However, it is not as easy to fake the ASN information of the IP address. So, when you use a combination of country + ASN, you can have a robust cybersecurity system.
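A minimal sketch of using the (country, ASN) pair as a fraud signal, along the lines described above. The field values, history set, and risk tiers are all illustrative, not IPinfo's product logic:

```python
# Compare the (country, ASN) pair of a new login against pairs previously
# seen for this account. A VPN can fake the country, but moving to a new
# ASN is harder to hide, so the pair is a stronger signal than either alone.
def login_risk(seen_pairs, current):
    """Return a coarse risk label from the (country, ASN) history."""
    country, asn = current
    if current in seen_pairs:
        return "low"            # exact match: same network as before
    if any(c == country for c, _ in seen_pairs):
        return "medium"         # same country, new network (new ISP or VPN?)
    return "high"               # brand-new country and network

history = {("DE", "AS3320"), ("DE", "AS8881")}  # illustrative German ISP ASNs

print(login_risk(history, ("DE", "AS3320")))   # low
print(login_risk(history, ("DE", "AS9009")))   # medium: same country, new ASN
print(login_risk(history, ("US", "AS9009")))   # high
```

In a real system you would also weight the ASN's type (hosting vs residential), which is exactly where VPN/proxy detection data plugs in.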
2 replies →
Have you considered making your database available for download as Parquet format so people could just copy the file to S3, Google Cloud, etc, and query it immediately with various tools?
I know it can be done with CSV but it's not as smooth.
Thank you for the feature request.
We usually just send users the documentation for ingesting the data in CSV or NDJSON (newline-delimited JSON) format. We don't actually get many requests for data downloads in Parquet format. I think we have a few customers where we deliver the data in Parquet format directly to their cloud storage bucket.
But keep an eye out for our emails in case we announce Parquet data downloads. I will talk with the folks about this.
BUT, there is some good news.
At least for the free database, we deliver the data directly to data warehouse platforms. Not even storage buckets. And we supply a good amount of documentation.
We have the free database in Snowflake, GCP, Kaggle, and Splitgraph, and we are working on a few more deals. For the free database, at least, we are working on something better than Parquet: a literally one-click solution to bring the IP data to your data warehouse.
Kaggle: https://www.kaggle.com/code/ipinfo/ipinfo-ip-to-country-asn-...
Snowflake: https://app.snowflake.com/marketplace/listing/GZSTZSHKQ4QY/i...
If you want to use our free IP database on Google Cloud or BigQuery, please send us an email (support@ipinfo.io) and mention that the DevRel sent you from HN. I can easily set you up with the free IP database in GCP/BQ.
Your comment is extremely interesting and what I was hoping to learn from the article (without an existing source of information, how do we determine the location of an IP address). Thank you!
I really appreciate it. Thank you. We are very transparent about our process. If you have any questions, you can always reach out to us.
We have a simplified explanation of our probe network here: https://ipinfo.io/blog/probe-network-how-we-make-sure-our-da...
The only update is the number of servers is like 600+ now. The probe network is growing extremely rapidly.
Our IP geolocation process is quite complicated, and we have a team of data engineers, infrastructure engineers, and data scientists working on various aspects of it. Therefore, our approach is that users can ask us questions, and we will try our best to answer them.
5 replies →
I just noticed that my wife's iPhone kept the same mycingular IP address while driving across 3 states over 5 hours, checking mail along the way.
There are several options/techniques for doing it. But just imagine you have a permanent zero-overhead VPN.
I don't know if that provider terminates long running calls, but the calls would stay up too regardless of tower.
3 replies →
Would you consider offering no-signup inspection of the data you hold on the requester's IP address? I would love to see what you have on MY IP address, and if it's sufficiently accurate, it would be a good incentive to sign up for commercial use.
It feels like it couldn't be abused by 'freeloaders', because I'd guess their use case is looking up other people's addresses, not their own.
We have a very open approach to our data. In fact, our website is extremely accessible. It is quite useful for researching IP addresses and does not require signing up. The data is largely available to view on the website. Although we display all IP address metadata on the home page, if you intend to use our website frequently, I recommend utilizing the IP data pages.
You can enter IP addresses on the right side to look up information here: https://ipinfo.io/what-is-my-ip
Additionally, we offer some enjoyable tools that you can use here: https://ipinfo.io/tools
The CLI tool is particularly entertaining.
You can also use our API service without signing up, with a limit of 1000 requests per day.
If you do choose to sign up for a free account, you will receive 50,000 requests per month, free IP databases, a bulk lookup feature, and more.
This is literally the most prominent thing on the https://ipinfo.io home page.
3 replies →
That's pretty neat! You're basically using ping triangulation!
Trilateration (same technique as used for mobile network location - in addition to the GPS on the phone)
Not gonna lie, this creeps the heck out of me.
Thousands of people live in a zip code, while hundreds of thousands of people live in a city. We are literally giving away that data for free through our API and database. The creepiness of IP geolocation is mostly a meme.
IP geolocation is mainly used in cybersecurity and marketing analytics. There are many ways to geolocate someone. I once came across a project that could estimate the country a user is from based on their writing style and grammar mistakes. For example, American people sometimes use "should of" instead of "should have". Knowing the geolocation of an IP address isn't super creepy. It's just how things work on the internet.
10 replies →
You might want to unplug your router then. A conceit of being connected to a network is you're connected to the network. If you can see other nodes they can see you.
1 reply →
Your IP address is LEAKING!
Together with the tons of data leaked by browsers it makes it very easy to track people across places and devices.
Can your probes be identified and blocked?
It is just ping data. We ping an IP address, get the RTT, draw a circle of the corresponding radius on the globe, and say that the IP could be anywhere inside that circle. Then we do another ping and draw another circle, and the IP address must be somewhere in the intersection of the two. Now, if we do it enough times, we get an estimate of where the IP address is located.
The data is not derived from the IP address itself, but from the measurement process. And it's just a ping. Moreover, the majority of IP addresses are not pingable, so we rely on other in-house statistical and scientific models to estimate the location. The probe infrastructure is extremely complicated and there are billions and billions of IP addresses, which is why we do not have a robust range-filter mechanism.
You can implement a dynamic ping blocking mechanism or use our data to find hosting ASNs and block ranges of those ASNs. You can download the database for free: https://ipinfo.io/developers/ip-to-country-asn-database
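The radius-intersection process described above can be sketched in a few lines. Everything here is illustrative: the probe coordinates and RTTs are made up, and the 100 km per millisecond of RTT figure is only a rough ceiling for light in fiber, not what a production system would use.

```python
import math

# Toy multilateration sketch: each ping gives an upper bound on distance
# (light in fiber covers roughly 100 km per millisecond of round-trip
# time), and the target must lie inside every probe's circle.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

probes = [  # (lat, lon, observed RTT in ms) -- fabricated values
    (52.52, 13.40, 4.0),    # Berlin
    (48.85, 2.35, 10.0),    # Paris
    (51.51, -0.13, 14.0),   # London
]

def consistent(lat, lon, km_per_ms=100.0):
    """True if (lat, lon) lies inside every probe's RTT-derived circle."""
    return all(haversine_km(lat, lon, plat, plon) <= rtt * km_per_ms
               for plat, plon, rtt in probes)

# Coarse grid search over Europe; the surviving cells approximate the
# feasible region for the target IP.
candidates = [(lat, lon)
              for lat in range(35, 70)
              for lon in range(-10, 40)
              if consistent(lat, lon)]
```

With the fabricated RTTs above, the feasible region collapses to a small area around Berlin, which is the "precision improves with more probes" effect the comment describes.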
13 replies →
How does that work with edge servers that use anycast to assume the same IP across different regions?
Aren't anycast addresses a specific subset of IPs, and thus knowable? IIRC, each autonomous system is allocated its anycast IP space?
1 reply →
Hi, cool idea with the geolocation via latency.
But I encountered two issues using ipinfo: Hetzner servers in Germany at a fixed location that never moved are sometimes geolocated in another country. For me it was once a server placed in Moscow and once in South America.
How does this happen?
If you can give me some information about the IP address or the IP range, I can take a closer look.
I guess it is because of IPv4 trading or IP address shuffling.
As far as I know, Hetzner, like many hosting companies, buys IPv4 addresses around the world. Here is an article on the IPv4 trades:
https://tech.marksblogg.com/ipinfo-free-ip-address-location-...
When a company buys an IP address block or relocates an IP block from one of its data centers to another, the location of those IP addresses changes.
If your IP address is static, but we have made an error in geolocation, I would love to take a closer look. You can email our support (support@ipinfo.io) and send a link to the comment. We can discuss it further from there.
Are there any historical sources for geo ip info?
We don't have any free data for that. We have historical data that we sell as part of our custom enterprise deals. Historical data requests are rare, though.
A time-series IP database requires a substantial amount of storage and computational cost to query, I imagine. The city-level geolocation data we have is ~1.5 GB in size. IP range data is complicated to query efficiently, as you need to understand data platform settings plus a good amount of networking math and computer science. Adding a layer of time-series complexity on top of that makes the process quite difficult.
To give you some context of how IP metadata lookups work, you can check out this article
https://ipinfo.io/blog/ip-address-data-in-snowflake/
Even if you keep all your database in a binary format, the computational cost is still non-negligible.
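For context on the range-lookup problem mentioned above, here is a minimal sketch of the usual approach: sort ranges by start address and binary-search the query IP. The rows are fabricated examples shaped like a typical range-database CSV export, not real data.

```python
import bisect
import ipaddress

# Sketch of an IP-range lookup: ranges sorted by start address,
# binary search on the integer form of the query IP. Rows are
# fabricated (start, end, country) examples.

rows = [
    ("1.0.0.0",   "1.0.0.255",   "AU"),
    ("8.8.8.0",   "8.8.8.255",   "US"),
    ("81.2.69.0", "81.2.69.255", "GB"),
]
starts = [int(ipaddress.ip_address(s)) for s, _, _ in rows]

def lookup(ip):
    """Return the country for ip, or None if no range contains it."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(starts, n) - 1
    if i >= 0:
        _, end, country = rows[i]
        if n <= int(ipaddress.ip_address(end)):
            return country
    return None
```

The time-series version the comment alludes to would add a validity interval per row and a second search dimension, which is where the cost comes from.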
hm, ipinfo.io tells me that I'm using a VPN even though I'm not...
Our VPN recognition is behavior-based. So, there is a chance that the IP address you are using is showing some of those behavior patterns.
A behavior pattern could be that your IP address is being shuffled across random locations, beyond the normal location churn of an ISP connection.
Also, if your IP range is listed in some public datasets that belong to a VPN service, we could recognize your IP as a VPN.
Please reach out to our support and let us know about this. Thanks
1 reply →
[dead]
If you don't want to do this yourself, you can actually just get Cloudflare to do it for you for free using a simple Worker since all Cloudflare requests contain approximate IP location information.
You can also just send a request to my URL (Cloudflare Worker operated - so it should have global low latency): https://www.edenmaps.net/iplocation
Use it for small applications, I don't mind. Just don't start sending me 10M requests per day ;-)
Or you download an IP database rather than sharing with a third party which IP address is likely connecting to your service with a third party
Located 100km from the Somali coast... I'm in Brussels, Belgium, thx for protecting my privacy :D
The result is [lon, lat]. You've most likely copied it into Google Maps, which works with [lat, lon]. Believe it or not, the industry still hasn't come up with a standard order.
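The mixup fits in a couple of lines; the Brussels coordinates below are approximate, and the [lon, lat] ordering is the GeoJSON-style convention being illustrated.

```python
# GeoJSON (and many APIs) order coordinates as [longitude, latitude],
# while map UIs such as Google Maps expect "latitude,longitude".
# Swapping before pasting fixes the "Somali coast" effect.

def to_maps_query(geojson_point):
    lon, lat = geojson_point          # GeoJSON order: [lon, lat]
    return f"{lat},{lon}"             # map-UI order: "lat,lon"

brussels = [4.35, 50.85]              # approximate [lon, lat] for Brussels
print(to_maps_query(brussels))        # "50.85,4.35"
```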
1 reply →
Got me to within 1km, that's pretty crazy
1 reply →
Does anyone know how accurate Cloudflare geolocation is (for workers requests)?
As accurate as MaxMind[1], since that's what they use [2]. In my experience, it's reasonably accurate for the US, less so for other countries. MaxMind publishes some accuracy data which might be an interesting starting point [3]
That said, for any analytics use cases of this data, be aware that MaxMind will group a lot of what should be unknowns in the middle of a country. Or, in the case of the US now, I think they all end up in the middle of some lake, since some farm owners in Butler County, Kansas got tired of cops showing up and sued MaxMind. It can cause odd artifacts unless you filter those addresses out somehow.
1 https://developers.cloudflare.com/support/network/configurin...
2 https://www.maxmind.com/en/geoip-demo
3 https://www.maxmind.com/en/geoip2-city-accuracy-comparison
1 reply →
I work for IPinfo and we do ping based geolocation. The best thing you can do to verify geolocation accuracy is the following:
- Download a few free IP databases
- Generate a random list of IP addresses
- Do the IP address lookups across all those databases
- Identify the IP addresses that can be pinged
- Visit a site that can ping an IP address from multiple servers
- Sort the results by lowest average ping time
Then check where the geolocation provider is locating the IP address and what is the nearest server from there.
2 replies →
This is excellent!
Would you mind open sourcing the code for that?
This is the code running this endpoint:
This is a function on Cloudflare Pages (which is just a different name for Cloudflare Workers). A minor adjustment is needed for plain Workers (get rid of "context", I believe).
I'm in Munich. Cloudflare tells a position that is 730km to the north in a random forest.
You've inverted lat, lon.
1 reply →
[dead]
As someone who lives in a country where the national language is not my first language, I hate websites that use IP location to make assumptions about my choice of language. It gets forced on me based on a lazy assumption, even though my browser is quite clearly sending language headers, which are ignored.
I work for an IP geolocation service, and even I hate this.
They call it "web experience personalization" in the industry, and it is annoying. I have never recommended doing it to anyone. The legitimate ways to personalize a website through IP geolocation are:
- Taxes and stuff (if applicable)
- Delivery costs (if applicable)
- Putting the user's country first in those country selection drop-down menu
And that's about it, off the top of my head. In my experience, these forced translations never work and only create distractions. Regardless of the website's positive intentions, using Google Translate to create a native-language version of the site is just not a good idea.
A fair number of popular internationalization frameworks also drive the idea that region and language are fixed pairs.
The example I often use to illustrate this problem is that there are roughly 4 million Norwegian speakers in the world, but 14 million speakers of Catalan. Visit an international website in Spain and you rarely get given the option to have it in Catalan.
Good example is Amazon.es https://www.amazon.es/customer-preferences/edit?from=mobile&...
I live in the US, and IP geolocation points to the incorrect regions (plural) on all my devices.
Few technologies manage to make my day-to-day internet experience worse the way these sorts of databases do.
I wish they would just go away.
Websites could just ask me my zipcode on first load instead of guessing it wrong every single time and then burying the flow to fix it behind multiple links and page loads.
Also: There is no way to fix the database to produce the “correct” or “better” answer. I rarely want a website to use my current location.
Instead, I check inventory for stores in places where I will be. This whole space is trying to solve an ill-posed problem.
I share your frustration so much that I wrote an article about it: https://www.fer.xyz/2021/04/i18n
It all depends on what you want to use it for and how accurate it needs to be.
The best way to build a geolocation service is to have a billion devices that report their location to you at the same time they report their IP to you. That's basically Apple and Google. They have by far the best geolocation databases in the world, because they get constant updates of IP and location.
The trick is basically to make an app where people willingly give you their location, and then get a lot of people to use it. That's the best way to build an accurate geo-location database, and why every app in the world now asks for your location.
4-square had the right idea, they were just ahead of their time.
Even 10 years ago, Apple's internal privacy policies prevented it from collecting precise lat/long. We had to use HTTP session telemetry to determine which endpoints were best for a given IP (or subnet, but not ASN), which informed our own pseudo-geoIP database so we knew which endpoint to connect to based on real-world conditions.
Even still, it had to be as ephemeral as possible for the sake of privacy. We weren’t allowed to use or record results from Apple Maps’ reverse geo service outside of the context of a live user request (finding nearby restaurants, etc).
You don't need precise lat/lon to make a good database. Even a 1km circle would be more than enough.
> but not ASN
Why wasn't ASN allowed? That's what Netflix used to make endpoint routing decisions and worked really well.
3 replies →
Somewhat relevant: Google Maps can learn the location of your IP based on which locations you browse in the map. If you browse a specific location enough times, it will use that as the default location when you open Google Maps, even if you clear all cookies. (I discovered this just from using Google Maps, and I'm a little concerned by the privacy implications, considering that multiple people may share an IP address.)
I suspect it's the other way around. Google just has a very good IP geolocation db, so it uses that when you browse, absent any other info.
Google certainly uses its geolocation DB, but it also learns based on map browsing patterns.
To clarify, the scenario I described is as follows: 1. Initially, when I open Google Maps in a clean browser it defaults to my real location. 2. I repeatedly browse some other location. 3. When I open Google Maps in a clean browser, it defaults to that other location. The only reason for Google Maps to pick that other location is my map browsing.
2 replies →
Well it has reporting beacons all over the world with GPS receivers, in the form of Android phones, and perhaps Google Maps users on iPhone too..
That would explain why it sometimes thinks I'm in a river I paddle often and other times at my summer house.
They use this for Google Workspace and data localization (including law enforcement localization).
Interesting but this isn't actually how geolocation is done, right? The ARIN/RIPE data isn't sufficiently accurate to be useful beyond country. Commercial geolocation involves correlating client IP vs known physical location e.g. from WiFi AP or mailing a package to the user. At least that's what I have been told over the decades.
I work in adtech and this is how we do geolocation. There's also device geolocation but if the user doesn't consent to sharing their GPS data with us, we just use IP address for targeting. Common provider for this is Maxmind; they ship a database that you host locally and query
Even the free maxmind db is accurate enough for most applications.
Does Cloudflare have the same data as Maxmind?
Because Cloudflare and Maxmind geolocate me to the exact same longitude/latitude.
1 reply →
Since you are in adtech: do you buy MaxMind, or roll your own? Are there any providers for US-only data, and therefore, cheaper?
1 reply →
[dead]
Comments seem fairly dismissive but I actually found this really interesting. It reminds me of a task I had in my first position to add PostGIS to our database and a location based search. That was based off addresses and zipcodes.
That's relatively simple to do, even in MySQL. One trick is to use a square instead of a circle, which avoids a lot of math.
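A sketch of that trick (the points and the 50 km radius are made up): compare against a degree-sized bounding box first, so the trigonometry only has to run on the few survivors. The ~111 km-per-degree-of-latitude figure is the usual approximation.

```python
import math

# "Square instead of a circle": a bounding-box comparison is cheap and
# index-friendly; the exact great-circle distance check would then run
# only on the rows that survive this prefilter.

def bbox_prefilter(points, lat, lon, radius_km):
    dlat = radius_km / 111.0                                   # ~111 km per degree of latitude
    dlon = radius_km / (111.0 * math.cos(math.radians(lat)))   # shrink with latitude
    return [(plat, plon) for plat, plon in points
            if abs(plat - lat) <= dlat and abs(plon - lon) <= dlon]

points = [(52.52, 13.40), (48.85, 2.35), (52.40, 13.05)]  # Berlin, Paris, Potsdam
near_berlin = bbox_prefilter(points, 52.52, 13.40, 50)
```

In SQL the same idea becomes two `BETWEEN` clauses on indexed lat/lon columns, with the haversine formula applied only to the matches.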
So, at the risk of outing myself, I wrote http://www.hostip.info a long time ago* which used a community approach to get ip address location ("is this guess wrong ? Fix it please").
The last time I checked (maybe a decade ago [grin]) it worked pretty much perfectly for a country, imperfectly for a region, and better-than-a-coin-toss for city resolution. All the data is free.
I don't think they have it on the site any more, but I used to have a rotating 3D-cube thing (x,y,z were the first 3 octets of the address) for things like known-addresses, recent lookups, etc. I used different colours for different groups (country, continent,...) It was so old it was written as a Java applet. Yeah. I guess if I were to do it again, it'd be WebGL.
--
*: I sold it a long time ago, with the proviso that the data must always remain free. I actually didn't believe the offer at first (it came as an email, and looked like a scam) but it went through escrow.com just fine, and I think we both walked away happy. That was almost 2 decades ago now though.
A modern version of the ping-based geoip mentioned
https://github.com/Ne00n/yammdb
This just links to an mmdb file that is already compiled; there isn't anything here to show this is a "modern" implementation of anything if the implementation itself isn't available.
Any suggestions for geolocating datacenter IPs, even very roughly? I'm analysing traceroute data, and while I have known start and end locations, it's the bit in the middle I'm interested in.
I can infer certain details from airport codes in node hostnames, for example.
It would also be possible, I guess, to infer locations from average RTTs, presuming a given node isn't having a bad day.
Anyone have any other ideas?
Edit: A couple of troublesome example IPs are 193.142.125.129, 129.250.6.113, and 129.250.3.250. They come up in a UK traceroute - and I believe they're in London - but geolocate all over the world.
Those IPs are owned by Google and NTT, who both run large international networks and can redeploy their IPs around the world when they feel like it. So lookup based geolocation is going to be iffy, as you've seen.
Traceroute to those IPs certainly looks like the networking goes to London.
The Google IP doesn't respond to ping, but the NTT/Verio ones do. I'd bet that if you ping from London-based hosting, you'll get single-digit-ms responses, which sets an upper bound on the distance from London. Ping from other hosting in the country and across the Channel, and you can confirm that the lowest ping comes from London hosting, and there you go. It could also be that its connectivity runs through London but the host is elsewhere; you can't really tell.
Check from other vantage points, just to make sure it's not anycast; if you ping 8.8.8.8 from most networks around the world, you'll get something nearby; but these IPs give traceroutes to london from the Seattle area, so probably not anycast (at least at the moment, things can change).
If you don't have hosting around the world, search for public looking glasses at well-connected networks that you can use for pings like this from time to time.
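The upper-bound arithmetic above is simple enough to write down. The 200 km/ms one-way figure is the usual rule of thumb for light in fiber, and real paths are longer than great-circle distance, so this is a ceiling, not an estimate.

```python
# Rough distance ceiling from a single ping: light in fiber travels
# about 200 km per millisecond one way, and the RTT covers the path
# twice, so the target can be at most ~100 km away per ms of RTT.

def max_distance_km(rtt_ms, km_per_ms_one_way=200.0):
    return (rtt_ms / 2.0) * km_per_ms_one_way

# A 2 ms ping from London hosting caps the target at ~200 km from
# London, which is consistent with "it's in or near London".
print(max_distance_km(2.0))
```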
This looked promising:
"TULIP's purpose is to geolocate a specified target host (identified by IP name or address) using ping RTT delay measurements to the target from reference landmark hosts whose positions are well known (see map or table)."
https://tulip.slac.stanford.edu/
But the endpoint it posts to seems dead.
https://ensa.fi/papers/geolocation_imc17.pdf has some ideas.
Using RIPE atlas probes to get RTT to the IPs from known locations is close to your idea and probably the best anyway.
> A couple of troublesome example IPs are 193.142.125.129, 129.250.6.113, and 129.250.3.250. They come up in a UK traceroute - and I believe they're in London - but geolocate all over the world.
If I'm running a popular app/web service, I would have my own AS number, I would purchase a few blocks of IP addresses under this AS, and then I would advertise these addresses from multiple owned/rented datacenters around the world.
These BGP advertisements would be to my different upstream Internet service providers (ISPs) in different locations.
For a given advertisement from a particular location, if you see a regional ISP as upstream, you can make an educated guess that this particular datacenter is in that region. If these are Tier 1 ISPs who provide direct connectivity around the world, then even that guess is not possible.
You can see the BGP relationships in a looking glass tool like bgp.tools – https://bgp.tools/prefix/193.142.125.0/24#connectivity
If you have the ability to do traceroutes from multiple probes with known locations sprinkled across the globe, then you could triangulate by looking at the fixed IPs of the intermediate router interfaces.
Even this is defeated if I use a CDN like Cloudflare to advertise my IP blocks from their 200+ PoPs and ride their private network across the globe to my datacenters.
> If you have ability to do traceroute from multiple probes sprinkled across the globe with known locations
Everyone who's aware of RIPE Atlas has that ability.
I have almost a billion RIPE Atlas credits. A single traceroute costs 60. I have enough credits to run several traceroutes on the entire IPv4 internet. (the smallest possible BGP announcement is /24, so max of 2^24 traceroutes, but in reality it's even less).
These IP geolocation lookups never seem to work for me.
They are always multiple states off, and when I check multiple different services, they pretty much never agree with each other.
"how to scrape an ip geolocation database"
You know you can just run a whois query per ip you want to analyze, no point in scraping the whole ipvN space.
I have to scrape the whole IP address space since I offer location information as part of my API.
Also, I only need to scrape as many WHOIS records as there are distinct networks out there. For example, for the IPv4 address space there are far fewer networks than there are IPv4 addresses (2^32).
Also, most RIR's provide their WHOIS databases for download.
Therefore, "scraping" is not really the correct word; it's a hybrid approach, mostly based on publicly available data from the five RIRs.
What was the easiest and the most frustrating part?
1 reply →
The whois data for IP is not accurate.
whois has no sane format.
RDAP is run by all the RIRs, returns JSON, and has all the whois data except IRR.
And it does a 302 redirect to the best source.
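For the curious, this is roughly what an RDAP IP-network record looks like once fetched. The record below is a trimmed, fabricated example in the RFC 9083 shape, so no network access is needed here; a real query would go through https://rdap.org/ip/&lt;addr&gt;, which redirects to the authoritative RIR.

```python
import json

# Parse a fabricated, RFC 9083-shaped RDAP response for an IP network.
# All five RIRs serve this same JSON schema, which is what makes RDAP
# so much nicer to consume than free-form whois text.

raw = json.dumps({
    "handle": "8.8.8.0 - 8.8.8.255",
    "startAddress": "8.8.8.0",
    "endAddress": "8.8.8.255",
    "country": "US",
    "entities": [{"handle": "GOGL", "roles": ["registrant"]}],
})

rec = json.loads(raw)
registrant = next(e["handle"] for e in rec["entities"]
                  if "registrant" in e["roles"])
print(rec["country"], rec["startAddress"], registrant)
```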
Step 1: Download Geolocation Database
Scroll down, the article is confusingly below that
Step 1: Download Geolocation Data
Unless you think CSV is a database?
Maybe a dumb question (I have no knowledge), but why wouldn't we think of .CSV files as databases? It can have columns and rows filled with information and isn't that what makes a thing a database?
1 reply →
Are we really going to do the mincing of words here? Did you need the word "dump" or "export" before you understood? Although I wasn't wild about the original poster's "step 1" terseness, it's silly to think a normal person wouldn't be able to parse the sentence well enough to understand "download the database contents - perhaps stored in CSV format".
If in your mind database implies a type of technology and not something conceptual, you’re really just outing yourself as someone that needs someone between you and the boardroom. Certainly not something to show off on Hacker News.
>>> Consider Open Source Geolocation Projects
Not the definition of "from scratch" in my book
This is a very useful .csv. What is the license? Is it free for personal and commercial use?
I think it's interesting that the one IP range I decided to check has correct information on the ipapi.is web site, but unambiguously incorrect information in the downloadable geolocationDatabaseIPv4.csv. Somehow Bedford, New Hampshire (which came straight from WHOIS) became Bedford, Texas.
How'd that happen?
Question: What’s the motivation to put coordinates in one’s own WHOIS record? (geoloc/geofeed)
Many service providers actually want their clients to be able to locate them.
I’m duplicating my comment elsewhere in this thread, so each serves as a direct reply to the different geolocation providers in this thread, in the hope that it will be recognized as a problem with data that implies that it’s more precise than it really is:
> On one hand, I love that there’s some good alternatives in the geolocation space, but misleading geolocation precision can lead to very undesirable side effects[0].
[0] https://www.theguardian.com/technology/2016/aug/09/maxmind-m...
geofeed is used by big CDNs, it can actually help save money for the provider by meaning a CDN uses a more optimal network location.
I feel like a more useful and accurate way would be to buy client ip and GPS location data in bulk from one of the mobile data brokers who have their spyware embedded in zillions of popular apps/games and then group it by /24 or something.
Shameless self-promotion:
I built a page to compare IP geolocation providers: https://resolve.rs/ip/geolocation.html
I'll work on adding ipapi.is shortly!
I am using the free version of https://db-ip.com, seems pretty good.
Fantastic stuff. I work for IPinfo.io and I actually came across your site about a month ago. I was planning to contact to you about mentioning the free IP databases [0].
However, when I saw that a few of the APIs didn't return any response, I thought maybe the site was not maintained.
[0] https://ipinfo.io/products/free-ip-database
----
Tangent
I find geographic coordinate values returned with up to 15 decimal places absurd for an IP geolocation response. IP geolocation is never that precise, and this level of "precision" is not warranted and is frankly distracting. At best it should be 4 decimal places, which is already ~11 m of latitude resolution.
relevant xkcd: https://xkcd.com/2170/
I expected traceroute to play a bigger part in this. If you know the route to an IP address and the location of routers, perhaps even from a few different servers, then you should be able to locate it fairly well.
Is anybody maintaining a historical archive of “IP address metadata” (which would include geolocation)?
If I have logs from 10 years ago, can I look up information about that IP as it was at the time?
Maybe not exactly what you're looking for, but the ipfire project has a git repository[1] mapping address ranges to countries. It apparently going back to 2017 only.
[1]: https://git.ipfire.org/?p=location/location-database.git;a=s...
[dead]
Here is a solution for those that care about speed:
https://www.miyuru.lk/geoiplegacy
rfc8805 https://datatracker.ietf.org/doc/rfc8805/
rfc9092 https://datatracker.ietf.org/doc/rfc9092/
A long time ago I built a project like that, but instead of relying on whois, I did a traceroute to every available IPv4 address. Several router hops have reverse DNS names that include city codes (like airport codes), and most providers have a single hop per city, so it's easy to correlate the last router hop to a city.
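A sketch of that hostname trick; the city-code table and the hostname patterns below are illustrative, since every provider uses its own reverse-DNS naming scheme and real tooling needs per-provider rules.

```python
import re

# Guess a city from a router's reverse-DNS name by matching known
# city/airport codes as standalone dot- or dash-separated tokens,
# optionally followed by digits (e.g. "lon01"). Codes and hostnames
# here are made-up examples.

CITY_CODES = {"lon": "London", "fra": "Frankfurt", "nyc": "New York"}

def guess_city(hostname):
    for code, city in CITY_CODES.items():
        if re.search(rf"(^|[.-]){code}\d*([.-]|$)", hostname):
            return city
    return None

print(guess_city("ae-1.r20.lon01.uk.example.net"))   # London
print(guess_city("xe-0.core1.fra.example.net"))      # Frankfurt
```

The token boundary in the regex matters: without it, "lon" would also match inside unrelated words in the hostname.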
Thanks for sharing.
I have heard there has been a lot of effort to use BGP data to build GeoIP databases.
Surely someone is using online shopping shipping addresses for this?
What are common use cases for needing IP geolocation?
The easiest way to get a geolocation is to ask the user. Maybe they’ll just tell you, and if that’s good enough for your application there’s no need for such solutions.
[dead]
[dead]
[dead]
Step 1. Download Visual Basic