Post by MimJannat99 on Nov 9, 2023 1:49:09 GMT -5
So let's compare scenarios:

Caching without a robot

If the robot function is disabled, the entire website caching process is initiated by the user. The site's cache remains empty until a user visits the site for the first time or, alternatively, submits a request.

[image: caching without robot]

What happens the first time a user visits? The aforementioned caching! On the first visit, the server receives the request and calls the PHP code to generate a static page. This page is then displayed to the user and cached for next time.

Worth knowing: caching saves server resources, because on subsequent visits the server serves the previously saved page without needing to call the PHP code again.

What does this mean in practice? On a user's first visit, the page loading time will be longer. Only after the page has been saved in the cache will this time be drastically reduced. It also means that after each cache refresh, e.g. after making changes to the content, the whole process starts over and the first page load is delayed again.

Caching with a robot

Now let's look at what happens when the page is scanned and the cache is refreshed by the LSCache crawler.

[image: caching with lscache robot]

What happens when the crawler visits a specific subpage?