By exploiting inconsistencies between a content delivery network (CDN) or shared cache server and the origin web server, adversaries can trick the cache into storing responses from private endpoints under static-asset rules, then retrieve them at will.
CDNs often differentiate between static assets (images, CSS, JavaScript) and dynamic pages. Static files usually carry permissive headers such as Cache-Control: public and generous max-age directives, while dynamic endpoints use no-store or private.
However, when a URL path mimics a static resource—by appending file extensions or delimiters—the CDN may cache it, whereas the origin will still treat it as a dynamic page.
| Mechanism | CDN Behavior | Origin Behavior |
|---|---|---|
| Cache-Control: public | Caches response at CDN and browser cache | No special handling |
| Cache-Control: private | Browser-only cache; no shared cache | No special handling |
| Cache-Control: no-store | No caching anywhere | No caching anywhere |
| Extension mapping (e.g., .css) | Treated as static asset; cached | Served dynamically; no caching |
| Path normalization | May ignore or resolve traversal sequences (../) when keying | Strictly normalizes, then resolves the route |
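The extension-mapping mismatch in the table can be sketched in a few lines of Python. The rule set and the prefix-based origin router below are illustrative assumptions, not any real CDN's or framework's API:

```python
# Illustrative static-asset extensions a CDN edge rule might match on.
STATIC_EXTENSIONS = (".css", ".js", ".png", ".ico")

def cdn_is_cacheable(raw_path: str) -> bool:
    # Edge rules often key on the raw path's apparent file extension.
    return raw_path.lower().endswith(STATIC_EXTENSIONS)

def origin_is_dynamic(raw_path: str) -> bool:
    # A hypothetical origin router that dispatches on path prefix,
    # ignoring whatever trailing "extension" the attacker appended.
    return raw_path.startswith(("/user/", "/my-account"))

# A dynamic endpoint dressed up as a stylesheet:
path = "/user/profile.html.css"
print(cdn_is_cacheable(path))   # True: edge stores it as a static asset
print(origin_is_dynamic(path))  # True: response is actually a private page
```

Because both checks return True for the same request, the private page is served by the origin yet stored under the cache's permissive static rules.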
Appending a static extension is the simplest variant: for a request to /user/profile.html.css, the CDN caches the endpoint under its static rules, but the origin still processes it as a dynamic page, exposing HTML responses via the cache.

Delimiters such as semicolons (;) or URL fragments (#) are also handled inconsistently. A request to /account;123%2Fsettings.css can be cached under /account;123/settings.css by the CDN, while the origin strips or normalizes the path differently, serving private pages.

Encoded traversal sequences such as %2E%2E%2F bypass CDN normalization, tricking the cache into believing the resource sits under a cacheable directory. Meanwhile, the backend sees the true path and returns sensitive content.

In a PortSwigger lab, the attacker found that /robots.txt was cached with max-age=30. By issuing:
```
GET /robots.txt%2F HTTP/2
Host: vulnerable.example.com
```
The CDN cached it (because it saw /robots.txt/ under the static rule), yet the origin server rejected other variants. Ultimately, the payload:
```
GET /my-account;%2F%2E%2E%2Frobots.txt?secret HTTP/2
Host: vulnerable.example.com
```
forced the cache to store the sensitive account response under the cacheable robots.txt rule, then allowed the retrieval of user account data.
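The divergence behind that payload can be reproduced with the standard library. The sketch below assumes, as in the lab, an origin (e.g. a Java framework) that truncates the path at the ';' path-parameter delimiter, and a cache that percent-decodes and resolves dot-segments before keying:

```python
import posixpath
from urllib.parse import unquote

raw_path = "/my-account;%2F%2E%2E%2Frobots.txt"

# Origin view: strip the Java-style path parameter (everything after ';').
origin_path = raw_path.split(";")[0]

# Cache view: decode percent-encoding, then resolve the ../ segment.
cache_key = posixpath.normpath(unquote(raw_path))

print(origin_path)  # /my-account  -> origin serves the private account page
print(cache_key)    # /robots.txt  -> cache stores it under the static rule
```

The same response is thus routed as /my-account by the origin but keyed as /robots.txt by the cache, which is exactly the mismatch the attack needs.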
Mitigation requires serving dynamic endpoints with Cache-Control: no-store at all layers. By adhering to consistent path handling and strict header policies, organizations can eliminate the gap that web cache deception relies upon, safeguarding dynamic content from unintended exposure.
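That policy can be expressed as a minimal sketch in which the origin decides cacheability from its own normalized routing result, never from the raw URL suffix; the route prefixes here are hypothetical examples:

```python
# Dynamic route prefixes that must never be cached (illustrative).
DYNAMIC_PREFIXES = ("/my-account", "/user", "/api")

def cache_control_for(normalized_path: str) -> str:
    # Decide the header from the origin's normalized path, so an appended
    # ".css" or encoded traversal sequence cannot flip the decision.
    if normalized_path.startswith(DYNAMIC_PREFIXES):
        return "no-store"
    return "public, max-age=86400"

print(cache_control_for("/my-account"))      # no-store
print(cache_control_for("/static/app.css"))  # public, max-age=86400
```

Applying the same rule at the CDN and the origin keeps both layers' views of "dynamic" aligned.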
The post New Cache Deception Attack Exploits Mismatch Between Cache and Web Server appeared first on Cyber Security News.