Posts: 2
Threads: 1
Joined: Feb 2014
I generate pages with XML files. However, when a new page is generated it is not visible on the front end. If I log into the admin, the page becomes visible just fine.
This behavior started after I upgraded to the latest version on my website:
manestate.com
Can anybody help explain this, and how to fix it? I create a new page when a user registers on the website.
Posts: 3,491
Threads: 106
Joined: Mar 2010
You should delete data/other/pages.xml after creating a page, so that the index/cache file is refreshed.
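A minimal sketch of Carlos's suggestion, assuming you are generating the page XML from your own registration code inside GetSimple (the `GSDATAOTHERPATH` constant is GetSimple's `data/other/` path; if your script runs outside the CMS, substitute the literal path — verify against your install):

```php
<?php
// After your registration code writes the new page's XML file,
// remove the cached page index so GetSimple rebuilds it next time.
$cache = GSDATAOTHERPATH . 'pages.xml'; // i.e. data/other/pages.xml
if (file_exists($cache)) {
    unlink($cache); // the index is regenerated the next time it is needed
}
```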
Posts: 6,266
Threads: 181
Joined: Sep 2011
We no longer constantly update the cache against the files, since editing the XML files directly is not a supported way of creating pages, and the constant check was a performance issue. This was actually a bug that was fixed.
If you want, you could use my component hook plugin or write an actual plugin to force the cache to regenerate on the front end.
You have to reload the admin's pages.php if you are manually editing pages. This might also be a problem if you have i18n installed, since it replaces pages. Actually, this might not even work either, since the cache only updates if the file count changes.
see
https://github.com/GetSimpleCMS/GetSimpleCMS/issues/560
Do you need help with this?
I would actually do this in a component that is called on a special page via the template or the DynPages component; then you can refresh that page on the front end whenever you need to. Or use a hook and gate it with an IP check, user cookie, query-string check, or whatever, for starters.
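A hypothetical sketch of the hook approach, gated by a query string so ordinary visitors never trigger the rebuild. This assumes GetSimple 3.x conventions — `register_plugin()`, `add_action()`, the `index-pretemplate` front-end hook, and `create_pagesxml()` from `admin/inc/caching_functions.php`; check these names against your version before relying on them:

```php
<?php
// Hypothetical plugin: regenerate pages.xml on demand from the front end.
$thisfile = basename(__FILE__, '.php');

register_plugin(
    $thisfile,                  // plugin id
    'Front-end Cache Refresh',  // plugin name
    '0.1',                      // version
    'example',                  // author
    '',                         // author url
    'Rebuilds pages.xml when ?refreshcache is in the URL'
);

// Run before the front-end template is rendered
add_action('index-pretemplate', 'refresh_page_cache');

function refresh_page_cache() {
    // Only run when explicitly requested, e.g. /?refreshcache
    // (add an IP or cookie check here if you want it locked down further)
    if (isset($_GET['refreshcache'])) {
        create_pagesxml('true'); // force a full rebuild of the page cache
    }
}
```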
Posts: 6,266
Threads: 181
Joined: Sep 2011
Never mind, I just reread your post. If you just want to detect new pages,
you can do this:
It will still do a file check, but it won't have to reload all pages unless the count changed.
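The snippet the post above refers to did not survive. A likely candidate, assuming GetSimple 3.x's caching API (unverified — check `admin/inc/caching_functions.php` in your version), is calling `create_pagesxml()` without the force flag, which matches the "only reloads when the count changed" behavior described:

```php
<?php
// Assumed reconstruction of the missing snippet: without the force flag,
// create_pagesxml() scans data/pages/ and only rebuilds pages.xml when
// the page count differs from the cached index.
create_pagesxml(); // cheap count check per call; full rebuild only on change
```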
A dir scan is still wasteful on every hit, so I would add some custom cookie detection or something so that only you trigger it, or use a custom template page as I mentioned before. Or, if you are using some FTP process, just delete the page cache like Carlos says.
Posts: 6,266
Threads: 181
Joined: Sep 2011
Actually, this is a two-parter. This particular issue looks like it comes from a security fix that prevents browsing files not in the page cache (such as uploaded pages, or directory-traversal attempts on slugs) and avoids doing a file-exists check for every bad request.
Posts: 2
Threads: 1
Joined: Feb 2014
Thank you. The easiest option for me is just to delete the cache file pages.xml. That is good enough.