Comment by dhx
11 hours ago
The situation with retail chains is improving thanks to projects such as https://alltheplaces.xyz/ (disclaimer: I'm a contributor) and efforts of some OSM contributors to focus their contributions towards comparing OSM and ATP features to add missing shops, remove closed shops, update opening hours, etc. For one such example, see https://matkoniecz.codeberg.page/improving_openstreetmap_usi... for a tool (created by https://news.ycombinator.com/user?id=matkoniecz) which is used to match and compare OSM and ATP features.
This work has been slow to take off though, as the OSM community has traditionally been stuck in time-wasting debates about whether the opening hours displayed on the wall of a shop are copyrighted (just the raw data, not a photo of their presentation), and in debating the merits and pitfalls of armchair mapping vs. on-the-ground mapping. At least these historical roadblocks now seem mostly resolved.
For OsmAnd, you might be able to use the OBF import feature (see https://www.osmand.net/docs/user/personal/import-export/) to add the raw ATP dataset, or potentially other open data such as Overture Maps if that is more to your liking. Data is mostly sourced directly from brand websites, APIs, etc. (as if you were using the storefinder map on their website).
Interesting project! OsmAnd user here, mainly in Germany.
In some cities OSM data is far more accurate than Google Maps when it comes to opening hours or whether a shop still exists. However, searching for places is a pain; the search really needs a big improvement.
Since I can't rely on the search, I usually try to find the POI category and click through the results: supermarkets, restaurants, pharmacies, ATMs, etc. That works, but with so many clicks and caveats. Search needs massive improvement.
> Search needs massive improvement.
Absolutely. Improving this would be a great boost in usability.
I love OsmAnd and I've been using it ever since I've been using phones that can navigate. That's why I've acquired a lot of arcane knowledge on how to find places in the search function. But I could never explain to anyone what I am doing there.
It starts with the mere fact that entering a street name will always search around the current search location, which is usually not where you are but the city where you last ran a lookup.
If you want to change the city, there is a tab for that. But consider using the postal code instead, because sometimes the place's official name differs from what people call it. Sometimes the same postal code appears multiple times, each with a subset of the place's streets, so you have to go through each one and look for your street. That just happened to me for Avignon (postal code 84000).
Another fun OsmAnd-introduced activity on the German Autobahn is being routed half off the main carriageway onto the parallel side tracks: they can be used to exit but also lead back onto the main track, with more crossing traffic along the way. It just loves doing that.
None of these disadvantages outweighs the level of detail and the possibilities in OsmAnd, and in OSM more broadly. I love knowing that I could use the same app if I ever had to use a wheelchair. I love being able to add notes to a place and getting an email update months later that someone fixed an issue I reported.
And when I use Google Maps every once in a full moon, I run into weird little glitches that surprise me a lot because the one thing I'd expect from this marvel of our monopolistic dystopia is that it "just works" - but it really doesn't. Don't ask me what issues I ran into last time. I forgot and they've probably been replaced by more confusing ones by now :)
Is there a feed of closed shops somewhere? If so, ArchiveTeam could use that to save their websites to archive.org.
https://wiki.archiveteam.org/
Nothing ready-to-go that I'm aware of. ATP will just observe in the next weekly crawl that a shop is no longer returned by the storefinder API call or sitemap crawl, and that shop will simply not be present in the next weekly dataset generated.
To set up archives of shop-specific pages (e.g. a record of opening hours, address, etc. at a point in time), one could monitor the latest builds at https://alltheplaces.xyz/builds.html and, when a new build completes, compare it against the previous build for differences. Then, for any feature whose attributes have changed (address, phone number, opening hours, etc.), archive the `website` and/or `source_uri` attribute pages again to ensure the latest snapshot is captured. Any newly observed feature would get the same treatment, so the page for the new shop/feature is archived for the first time.
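A minimal sketch of that build comparison, assuming each build is an ATP-style GeoJSON FeatureCollection and that features carry a stable `ref` property (the attribute names watched here are illustrative, not a definitive list):

```python
# Attributes whose change should trigger re-archiving (illustrative list).
WATCHED = ("addr:full", "phone", "opening_hours")

def pages_to_archive(old_build: dict, new_build: dict) -> set[str]:
    """Compare two ATP-style GeoJSON builds and return the set of
    `website` URLs worth (re-)archiving: newly observed features,
    plus existing features whose watched attributes changed."""
    def by_ref(build):
        # Index each feature's properties by its stable `ref` id.
        return {f["properties"].get("ref"): f["properties"]
                for f in build["features"]}

    old, new = by_ref(old_build), by_ref(new_build)
    urls = set()
    for ref, props in new.items():
        if ref not in old:
            changed = True  # newly observed shop: archive for the first time
        else:
            # attribute drift since the previous build
            changed = any(props.get(k) != old[ref].get(k) for k in WATCHED)
        if changed and props.get("website"):
            urls.add(props["website"])
    return urls
```

Features dropped between builds don't appear in the output, matching the point above: by the time a shop vanishes from the storefinder, its page may already be gone, so the useful snapshot is the one taken while it was still listed.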
I'm also aware ArchiveTeam projects tend to commence once the impending collapse of a retail chain is known and someone realises there is a website not archived which would be useful to preserve. Monitoring of ATP feature counts for brands across time may give some hint of how a brand is performing and whether it is growing or shrinking without having to find press releases and financial statements of the brand. Even if a brand suddenly announces bankruptcy (it happens all the time), generally the website will remain online for at least a few months whilst a new buyer is sought or whilst each retail location has a fire sale to get rid of remaining merchandise. It's also worthwhile to be aware of acquisitions of retail chains as this often results in the new parent company changing websites soon after acquisition closes, possibly removing useful content that once existed. Websites also change "just because" and this could be observed after-the-fact by seeing when ATP spiders break and get replaced/fixed.
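The brand-count monitoring described above could be as simple as tallying features per brand in each weekly build and diffing the tallies (again assuming ATP-style GeoJSON with a `brand` property; this is a sketch, not an official ATP tool):

```python
from collections import Counter

def brand_counts(build: dict) -> Counter:
    """Tally feature counts per brand in an ATP-style GeoJSON build."""
    return Counter(f["properties"].get("brand", "unknown")
                   for f in build["features"])

def brand_trend(old_build: dict, new_build: dict) -> dict:
    """Per-brand change in feature count between two builds;
    a steadily negative number may hint at a shrinking chain."""
    old, new = brand_counts(old_build), brand_counts(new_build)
    return {b: new[b] - old[b] for b in set(old) | set(new)}
```

A sustained negative trend for a brand could be a signal to start an archiving project before any bankruptcy announcement appears.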