Caches
Learn more about cache monitoring with Sentry and how it can help improve your application's performance.
This view is only available on Team and Business plans. On Team plans, you won't be able to query data that's more than 7 days old. To enable querying for all your data, upgrade to the Business or Enterprise plan.
A cache stores data temporarily so that subsequent access to that data is faster, which improves application performance. It allows your application to read data from cached memory (when it's available) instead of having to download that data again or retrieve it from a potentially slow data layer. Caching can speed up read-heavy workloads for applications like Q&A portals, gaming, media sharing, and social networking.
A well-performing cache has a high hit rate, meaning the requested data was present in the cache when it was fetched. A cache miss occurs when the requested data was not present in the cache. If you have performance monitoring enabled and your application uses caching, you can see how your caches are performing with Sentry.
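As a rough illustration of the difference between a hit and a miss, here is a minimal sketch using Django's cache API; get_user_from_db is a hypothetical helper standing in for a slow data layer:

```python
from django.core.cache import cache

def get_user(user_id):
    key = f"user:{user_id}"
    user = cache.get(key)                  # hit: the data was already in the cache
    if user is None:                       # miss: fall back to the slow data layer
        user = get_user_from_db(user_id)   # hypothetical database helper
        cache.set(key, user, timeout=300)  # store it for subsequent requests
    return user
```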
Sentry's cache monitoring provides insights into cache utilization and latency so that you can improve the performance of the endpoints that interact with caches.
Starting with the Cache page, you get an overview of the transactions within your application that are making at least one lookup against a cache. From there, you can dig into specific cache span operations by clicking a transaction and viewing its sample list.
Cache monitoring currently supports auto instrumentation for Django's cache framework when the cache_spans option is set to True, as shown in the sketch below.
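A minimal Python SDK setup that opts in to Django cache spans might look like the following; the DSN and sample rate are placeholder values:

```python
import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration

sentry_sdk.init(
    dsn="your-dsn-here",          # placeholder: use your project's DSN
    traces_sample_rate=1.0,       # performance monitoring must be enabled
    integrations=[
        DjangoIntegration(
            cache_spans=True,     # opt in to automatic cache span instrumentation
        ),
    ],
)
```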
Other frameworks require custom instrumentation.
Where available, custom instrumentation is documented on an environment-by-environment basis, as listed below:
To see what cache data can be set on spans, see the Cache Module Developer Specification.
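As a sketch of what custom instrumentation could look like in Python, the wrapper below starts a cache.get span and attaches cache.hit and cache.key data fields; the exact span op and field names should be taken from the specification, and my_cache / fetch_from_source are hypothetical helpers:

```python
import sentry_sdk

def cached_get(key):
    # Wrap the lookup in a span so the Caches view can attribute it.
    with sentry_sdk.start_span(op="cache.get", description=key) as span:
        value = my_cache.get(key)             # hypothetical cache client
        span.set_data("cache.key", key)
        span.set_data("cache.hit", value is not None)
        if value is None:                     # miss: fetch from the source of truth
            value = fetch_from_source(key)    # hypothetical slow data layer
            my_cache.set(key, value)
        return value
```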