Optimizing Large-Scale OpenStreetMap Data with SQLite

The author describes converting a massive OpenStreetMap dataset into an SQLite database to improve search performance. The dataset, split into nodes, ways, and relations, was reduced from 100GB to 40GB by filtering for entries with specific tags. To speed up queries, the author used SQLite's indexing capabilities and full-text search. Further compression with Zstandard cut the file size to 13GB while also improving query speeds. The author plans to rewrite the compression function and reduce false positives in queries. The project showcases iterative refinement and technology integration to improve search efficiency over a read-only SQL dataset.
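The full-text-search step can be sketched with SQLite's FTS5 extension. This is a minimal illustration of the general technique, not the author's actual schema; the table and column names below are hypothetical.

```python
import sqlite3

# In-memory sketch: store named OSM-style entries in an FTS5 virtual
# table so name lookups use the full-text index instead of a table scan.
# The "places" table and its columns are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE places USING fts5(name, tags)")
conn.executemany(
    "INSERT INTO places (name, tags) VALUES (?, ?)",
    [
        ("Golden Gate Bridge", "bridge tourism"),
        ("Golden Gate Park", "park leisure"),
        ("Bay Bridge", "bridge"),
    ],
)
# MATCH consults the full-text index; tokens are matched case-insensitively.
rows = conn.execute(
    "SELECT name FROM places WHERE places MATCH 'golden'"
).fetchall()
print(rows)
```

A query like `MATCH 'golden'` returns both "Golden Gate" entries, which also hints at the false-positive problem the author mentions: token-based matching can surface more candidates than intended.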

https://jtarchie.com/posts/2024-07-02-optimizing-large-scale-openstreetmap-data-with-sqlite