Caching Google Maps API Calls: Reducing Costs

If you are spending a lot of money on the Google Maps API, there are a few things you can do to reduce your costs.


For some context, this particular project uses the Google Maps API for location tracking, distance estimates (ETAs) and best-route estimates. If you do any of these things, you'll likely use the Geocoding, Distance Matrix and Directions APIs, and possibly more if you have to check for places like congestion zones or the Dartford Crossing.


Unfortunately, for most of us, Google charges per API call, so if you make 500 calls to any of these endpoints, you pay for all 500. The more calls you make, the more you pay at the end of the month.


For any business, and especially for smaller ones, these calls can quickly become expensive, particularly if the calls do not generate direct or indirect revenue.


In this project, a single user might make up to 25 calls while checking different pricing options, and that customer may or may not eventually buy from you. In reality, 95% of the users who hit your product might never buy, despite racking up Google API costs along the way.


For perspective, and for those who understand numbers better: if 100 customers visit your product and each uses up 25 calls searching various applicable routes and checking prices, that's 2,500 calls. If Google charges $0.10 per call, you'll be paying about $250 for those 100 customers, of whom only about 5 might buy - which means you'll likely be losing money on these customers.


These numbers are purely hypothetical, by the way. Although this post is based on a real-life situation, the numbers and many of the details are only representative, to give some perspective.


For this particular product, here is how we solved the problem:


1. In this case, customers trying different options usually request estimates for the same address, which means the 25 calls made to the Google API carry the same longitude and latitude. You can therefore reduce the API calls by caching the first response: if the same customer makes the same request again, the cached data is used for the distance estimate, and your product does the rest of the calculation. A sketch of this follows below.
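Here is a minimal sketch of that idea in Python. It assumes a simple in-memory cache keyed by rounded coordinates; fetch_distance_from_google(), the one-hour TTL and the 4-decimal rounding are illustrative choices, not necessarily what the product uses.

```python
import time
import requests

GOOGLE_API_KEY = "YOUR_KEY"      # placeholder
CACHE_TTL_SECONDS = 60 * 60      # keep an estimate for an hour (tune to taste)

_cache: dict[tuple, tuple[float, dict]] = {}

def fetch_distance_from_google(origin, dest) -> dict:
    """The paid call: Google Distance Matrix API (error handling omitted)."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/distancematrix/json",
        params={
            "origins": f"{origin[0]},{origin[1]}",
            "destinations": f"{dest[0]},{dest[1]}",
            "key": GOOGLE_API_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def distance_estimate(origin, dest) -> dict:
    # Round coordinates so minor GPS jitter still hits the same cache key.
    key = tuple(round(c, 4) for c in (*origin, *dest))
    hit = _cache.get(key)
    if hit and time.time() - hit[0] < CACHE_TTL_SECONDS:
        return hit[1]                           # cache hit: no Google charge
    result = fetch_distance_from_google(origin, dest)
    _cache[key] = (time.time(), result)         # first response is cached
    return result
```

An in-memory dict is fine for a single process; the same pattern works with Redis or Memcached if the cache needs to be shared across servers.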


2. Spend on an in-house database. Certain things about addresses hardly ever change. One of the things we do in the product is get the longitude and latitude of an address so they can be used to fetch other details, and the longitude and latitude of a location will likely never change. At the same time, this approach is a bit of a long shot. You can create a dedicated database to keep the postcode, longitude, latitude and other details. Your codebase then changes so that (see the sketch after this list):

  • Start by checking your cache to see if the data already exists.
  • Then check your db to see if you already have it.
  • Otherwise, make the Google API call.
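Here is a minimal sketch of that order in Python, with SQLite standing in for the dedicated database. geocode_with_google() is a hypothetical wrapper around the Geocoding API, and the table and helper names are purely illustrative.

```python
import sqlite3
import requests

GOOGLE_API_KEY = "YOUR_KEY"      # placeholder

conn = sqlite3.connect("geo.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS postcodes "
    "(postcode TEXT PRIMARY KEY, latitude REAL, longitude REAL)"
)

_memory_cache: dict[str, tuple[float, float]] = {}

def db_lookup(postcode):
    row = conn.execute(
        "SELECT latitude, longitude FROM postcodes WHERE postcode = ?", (postcode,)
    ).fetchone()
    return tuple(row) if row else None

def store_postcode(postcode, lat, lng):
    conn.execute(
        "INSERT OR REPLACE INTO postcodes VALUES (?, ?, ?)", (postcode, lat, lng)
    )
    conn.commit()

def geocode_with_google(postcode):
    """The paid call: Google Geocoding API (error handling omitted)."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": postcode, "key": GOOGLE_API_KEY},
        timeout=10,
    )
    loc = resp.json()["results"][0]["geometry"]["location"]
    return loc["lat"], loc["lng"]

def lookup_postcode(postcode):
    postcode = postcode.strip().upper()
    if postcode in _memory_cache:               # 1. cache
        return _memory_cache[postcode]
    coords = db_lookup(postcode)                # 2. in-house db: lat/long rarely change
    if coords is None:
        coords = geocode_with_google(postcode)  # 3. last resort: paid Google call
        store_postcode(postcode, *coords)       # remember it so we never pay twice
    _memory_cache[postcode] = coords
    return coords
```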

3. One more step we added was to check for alternative free services that could replace Google. For getting longitude and latitude, looking up a postcode from an address, or getting address suggestions, we found alternative services that were either free or close to it (and in any case cheaper than Google). Our steps then become (a sketch of the full chain follows the list):


  • Check the cache to see if the data already exists.
  • Check the local db in case it's already stored there.
  • Check the free / cheaper services.
  • As a last resort, call the Google API.
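Building on the helpers from the previous sketch, the full chain might look like the following. The alternative services we actually used aren't named in this post; postcodes.io is just one example of a free UK postcode lookup, and you'd need to check that any such service fits your use case.

```python
import requests

def free_service_lookup(postcode):
    """Free UK postcode lookup via postcodes.io; returns None when not found."""
    resp = requests.get(f"https://api.postcodes.io/postcodes/{postcode}", timeout=10)
    if resp.status_code != 200:
        return None
    result = resp.json()["result"]
    return result["latitude"], result["longitude"]

def resolve_postcode(postcode):
    """Cheapest source first: cache -> in-house db -> free service -> Google."""
    postcode = postcode.strip().upper()

    if postcode in _memory_cache:                    # 1. cache (previous sketch)
        return _memory_cache[postcode]

    coords = db_lookup(postcode)                     # 2. in-house db (previous sketch)

    if coords is None:
        coords = (free_service_lookup(postcode)      # 3. free / cheaper service
                  or geocode_with_google(postcode))  # 4. last resort: paid Google call
        store_postcode(postcode, *coords)            # keep it so we never pay again

    _memory_cache[postcode] = coords
    return coords
```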

By doing this, we reduced costs by a significant amount. To support these decisions, we made sure to have plenty of tracking in place. For example, we know customers typically send the same set of data multiple times before deciding to buy, and we know which responses we can keep reusing (hence the cache); that, plus other analytical data, helped us decide on these steps.
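As a rough illustration of the kind of tracking involved, simply counting which layer answered each lookup is enough to see how many paid Google calls you are avoiding. The layer names below mirror the resolve_postcode() sketch above; the real product's analytics were more involved than this.

```python
from collections import Counter

# Incremented at each early return in resolve_postcode(), e.g.
# lookup_source["cache"] += 1 on a cache hit and
# lookup_source["google"] += 1 when we fall through to the paid call.
lookup_source = Counter()

def report():
    total = sum(lookup_source.values()) or 1
    for source, count in lookup_source.most_common():
        print(f"{source}: {count} lookups ({count / total:.0%})")
```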


All of the above may not apply to you or your product, but it should give you an idea of how you can reduce costs.


If you want to discuss this or have further suggestions, don't hesitate to contact me on Twitter.


Happy coding!