javascript - Google Maps V3 rendering over 1 million markers (in a reasonable time) - Stack Overflow

I have recently created a Google Map using V3 of the API (latest version). One of my requirements is that I am able to render over 1 million markers (in a reasonable time). A reasonable time would be under 15 seconds.

I know that it is fairly crazy to render all 1 million markers at once, which is why I have investigated performance options. One of the options I came across and utilized is the MarkerClusterer: https://developers.google.com/maps/articles/toomanymarkers
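
Roughly, the setup follows the standard example from that article, something like the sketch below (the locations array and the marker creation are simplified placeholders for the real data):

// Rough MarkerClusterer wiring, following the linked article.
// "locations" is a placeholder for the real data set ({ lat, lng } objects).
const map = new google.maps.Map(document.getElementById('map'), {
    zoom: 3,
    center: new google.maps.LatLng(0, 0),
});

const markers = locations.map(
    (loc) => new google.maps.Marker({
        position: new google.maps.LatLng(loc.lat, loc.lng),
    })
);

const markerCluster = new MarkerClusterer(map, markers);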

However, I am now starting to see performance issues when testing the MarkerClusterer with over 100,000 markers, as it is taking a long time (1 min+) to render the map and markers. Eventually, I managed to make the page crash with around 200,000 markers.

Is there any way to improve the performance of the map when using this many markers?

Thanks in advance for any help.

Asked Mar 17, 2014 at 9:41 by Jiminy
  • You could consider using a FusionTablesLayer instead, adding circles where you used to draw markers. I don't expect all markers need to be interactive :) Haven't tested it, but I think that would be a much faster approach. – davidkonrad, Mar 17, 2014 at 10:48
  • I will give these some consideration. I have had a quick look at the documentation, and it seems as though the data is held by Google so that they can do the processing. This is fine, but I'll have to see if these can be updated automatically without any manual intervention needed. – Jiminy, Mar 17, 2014 at 13:43

3 Answers


I had a similar challenge of showing a million+ points on a map.

I made use of Elasticsearch clustering on the server side and a Leaflet marker cluster on the client side.

Maybe this will be helpful.

Demo here

Check the code here
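
As a rough illustration of that approach (not the exact code from the demo above), here is a minimal sketch of asking Elasticsearch for geohash-grid buckets and drawing one Leaflet marker per bucket; the index name, field name, and search URL are assumptions:

// Minimal sketch: server-side clustering via an Elasticsearch geohash_grid
// aggregation, client-side rendering with Leaflet. The index ("points"),
// geo_point field ("location"), and URL are assumptions.
async function drawClusters(map) {
    const response = await fetch('/points/_search', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            size: 0, // only the aggregation buckets, not the raw documents
            aggs: {
                grid: {
                    geohash_grid: { field: 'location', precision: 5 },
                    aggs: { centroid: { geo_centroid: { field: 'location' } } },
                },
            },
        }),
    });
    const body = await response.json();

    // One circle marker per geohash cell, sized by how many points it holds.
    body.aggregations.grid.buckets.forEach((bucket) => {
        const { lat, lon } = bucket.centroid.location;
        L.circleMarker([lat, lon], {
            radius: Math.min(20, 4 + Math.log(bucket.doc_count)),
        })
            .bindTooltip(String(bucket.doc_count))
            .addTo(map);
    });
}

Re-running the aggregation with a higher precision as the user zooms in keeps the number of buckets small at every zoom level, so the browser never has to hold the full million points.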

I have used Google Fusion Tables successfully and it is very fast and quite simple - once you work out how to use OAuth2....

The tables are limited to 100,000 entries each and you upload them from a CSV file - either by going to your Google Drive in a browser or programmatically using curl or Perl.

To get beyond the 100,000-entry limit, you can add up to 5 layers to your map, but that will still only get you to 500,000 points. I can't suggest anything more than that.

My project is here if you want a look.
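
For reference, stacking the layers looks roughly like the sketch below; the table IDs and column name are placeholders, and note that Google has since retired Fusion Tables (in 2019), so this only applies to older projects:

// Sketch of stacking several FusionTablesLayer objects on one map.
// Table IDs and the "Location" column name are placeholders; the Fusion
// Tables service itself was shut down in 2019.
const tableIds = ['TABLE_ID_1', 'TABLE_ID_2', 'TABLE_ID_3'];

tableIds.forEach((tableId) => {
    const layer = new google.maps.FusionTablesLayer({
        query: {
            select: 'Location',
            from: tableId,
        },
    });
    layer.setMap(map); // assumes an existing google.maps.Map instance "map"
});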

Another option would be heat maps.

https://developers.google.com/maps/documentation/javascript/heatmaplayer

Something I have to remind myself (I do note that the original question is from 2014) is that JavaScript is fast. In fact, when I think JS is to blame, it is often actually the DOM that is failing.

In 2022 it wouldn't be unreasonable to hold 1,000,000 data points in a constant. The problem then becomes a render issue. Heatmaps, I've found, render pretty quickly as long as the data doesn't change.

Though, even by 2022, I have a hard time believing you can load 1,000,000 records in under 15 seconds. I suppose if the user logs into a dashboard and you start loading asynchronously, it would seem a lot quicker by the time the user reaches the page. A lot of smart applications make use of the perception of speed when, behind the scenes, they just preload smartly.

A technique that uses this would be to render the markers/heatmap for the current bounds plus a 100 mi radius. Then, as the user zooms and drags around, you can slice the 1,000,000-item array as needed.

/**
 * A lot of assumptions are being made about the backend. There are various
 * techniques you can use to speed up loading, my favorite being MongoDB with
 * $geoWithin and loading multiple pages at the same time.
 */
(async function () {
    // Placeholder endpoint for your own backend API.
    const myRestUrl = '/api/points';

    let Points = [];
    let Bounds = {};
    let Map;
    let Heatmap;
    let BoundTimeout;

    async function getPoints(page = 1) {
        let response;

        try {
            response = await fetch(myRestUrl, {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify(Bounds),
            });
        } catch (e) {
            console.error(e);
        }

        /**
         * If you pass in page, you can return a total in the response. Then, based
         * on your perpage setting you can deduce the total number of pages you
         * need. Throw that in a for loop and add to your Points array.
         *
         * In this example, assuming one phat response...
         */

        if (response && response.ok) {
            Points = await response.json();
        }
    }

    function generateGoogleMap() {
        // Assumes your map is #map.
        Map = new google.maps.Map(document.getElementById('map'), {
            center: new google.maps.LatLng(37.782, -122.447),
            zoom: 13,
            mapTypeId: 'satellite',
        });

        Map.addListener('bounds_changed', BoundChange);
        BoundChange();
    }

    function BuildHeatmap() {
        if (Heatmap) {
            Heatmap.setMap(null);
            Heatmap = null;
        }

        // Requires the Maps JS API to be loaded with &libraries=visualization.
        Heatmap = new google.maps.visualization.HeatmapLayer({
            // Assuming you're storing your points as GeoJSON Points ([lng, lat])...
            data: Points.map(
                (P) => new google.maps.LatLng(P.coordinates[1], P.coordinates[0])
            ),
        });

        Heatmap.setMap(Map);
    }

    /**
     * Populates the Bounds variable with a GeoJSON rectangle for Point lookup.
     */
    function BoundChange() {
        clearTimeout(BoundTimeout);
        const MapBounds = Map.getBounds();

        // The map may not have reported its bounds yet (e.g. on the first call).
        if (!MapBounds) return;

        const sw = MapBounds.getSouthWest();
        const ne = MapBounds.getNorthEast();

        Bounds = {
            type: 'Polygon',
            coordinates: [
                [
                    [sw.lng(), sw.lat()],
                    [ne.lng(), sw.lat()],
                    [ne.lng(), ne.lat()],
                    [sw.lng(), ne.lat()],
                    [sw.lng(), sw.lat()],
                ],
            ],
        };

        BoundTimeout = setTimeout(async () => {
            await getPoints();
            BuildHeatmap();
        }, 1000);
    }

    /**
     * Preload the Points. Ideally this would be on an event, state, or tick
     * system, but for simplicity we're just waiting for the results.
     */
    await getPoints();

    // Generate the Map.
    generateGoogleMap();
})();

The above sample doesn't take the 100 mi radius into account; I'd recommend that be handled on the server. Another idea would be to use web sockets, which can be faster if you expect your users to leave the window open as a dashboard (for example). The example also doesn't take libraries, the DOM, or loading order into account, so it will not work as a simple copy and paste; it is provided only as an example.
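
As one possible shape for that server-side step, here is a hedged sketch using the Node.js MongoDB driver with $geoWithin and $centerSphere to fetch everything within roughly 100 miles of the map centre; the database, collection, and field names are assumptions:

// Sketch of the server-side lookup: everything within ~100 miles of the map
// centre. Database ("maps"), collection ("points"), and field ("location") are
// assumptions; "location" is expected to be a GeoJSON Point with a 2dsphere index.
const { MongoClient } = require('mongodb');

async function pointsNear(lng, lat, radiusMiles = 100) {
    const client = await MongoClient.connect('mongodb://localhost:27017');
    try {
        return await client
            .db('maps')
            .collection('points')
            .find({
                location: {
                    $geoWithin: {
                        // $centerSphere takes the radius in radians:
                        // miles / Earth's radius in miles (~3963.2).
                        $centerSphere: [[lng, lat], radiusMiles / 3963.2],
                    },
                },
            })
            .toArray();
    } finally {
        await client.close();
    }
}

The returned documents can then feed the getPoints response above, and widening or tightening the search area is just a matter of changing radiusMiles.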
