Hi morvy,
I think there is nothing intrinsically wrong with using the "data_index" filter to inject your data.
Not sure if it helps, but you can also inject other page data into "data_index", such as the URL (slug) or the meta description.
And yes, the native GS rewrite rule is not very flexible for the dynamic or "fake" URL structures you're using. I always prefer to change this rule to something more flexible, such as:
Code:
RewriteRule ^(.*)$ index.php?id=$1 [L,QSA]
This rule gives you definitely more control over the entire URL or path. Next, you could create a simple "Router" plugin that makes it much easier to access the "URL segments" (I call "segments" the parts of a URL delimited by slashes: /catalog/category/subcategory/etc.).
Take a look at this example; it's best to install it as a plugin so you can play with it (also be sure to change the rewrite rule as I described above, otherwise it won't work):
https://gist.github.com/bigin/0395fb4f3c...5cf8ba62b4
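Just to illustrate the idea, segment access could be sketched roughly like this (a minimal standalone sketch; the class and method names here are my own placeholders, not necessarily what the gist uses):

```php
<?php
// Minimal sketch of URL-segment parsing, assuming the rewrite rule
// above passes the requested path to index.php via the "id" parameter.
class UrlSegments
{
    private $segments = array();

    public function __construct($path)
    {
        // Drop any query string, trim surrounding slashes, then split
        $path = parse_url($path, PHP_URL_PATH);
        $path = trim($path, '/');
        $this->segments = ($path === '') ? array() : explode('/', $path);
    }

    // Return the segment at $index, or null if it does not exist
    public function get($index)
    {
        return isset($this->segments[$index]) ? $this->segments[$index] : null;
    }

    public function count()
    {
        return count($this->segments);
    }
}

// Example: /catalog/category/subcategory
$segments = new UrlSegments('/catalog/category/subcategory');
echo $segments->get(0) . "\n"; // catalog
echo $segments->get(2) . "\n"; // subcategory
```

With something like this in place, your templates can branch on `$segments->get(0)` instead of parsing `$_GET['id']` by hand every time.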
I didn't quite understand your second question: are you looking for a function to generate a sitemap, or just an approach to trigger a sitemap script when the webhook fires?
Hmmm... to create a sitemap I would probably go another way and generate sitemap.xml dynamically, only when it is requested by the robots, instead of regenerating a physical file on every hook. For this, however, you have to write your own script. The nice thing is that you could probably just extend the script from above a little to trigger the generator function with the following code:
Code:
if($input->urlSegments->get(0) == 'sitemap.xml') {
    header("Content-Type: text/xml");
    echo renderSitemapXML();
}
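Here `renderSitemapXML()` would be your own function. A minimal sketch might look like this; note that the `$pages` array shape (url + last-modified timestamp) is just an assumption, you'd fill it from wherever your page data actually lives (the GS API, your "data_index", etc.):

```php
<?php
// Hypothetical sketch of a sitemap generator. $pages stands in for
// whatever list of pages you collect from GS.
function renderSitemapXML(array $pages)
{
    $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    $xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($pages as $page) {
        $xml .= "  <url>\n";
        $xml .= '    <loc>' . htmlspecialchars($page['url']) . "</loc>\n";
        // gmdate() keeps <lastmod> timezone-independent
        $xml .= '    <lastmod>' . gmdate('Y-m-d', $page['lastmod']) . "</lastmod>\n";
        $xml .= "  </url>\n";
    }
    $xml .= '</urlset>';
    return $xml;
}

echo renderSitemapXML(array(
    array('url' => 'https://example.com/catalog/', 'lastmod' => time()),
));
```

Since the XML is built on request, it is always in sync with your pages and there is nothing to regenerate when the hook fires.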
Of course, the sitemap.xml file must not physically exist in the directory, because it would then be served directly due to this rule:
Code:
RewriteCond %{REQUEST_FILENAME} !-f
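For context, the relevant part of the .htaccess would then look roughly like this (a sketch of how the condition and the modified rule fit together, not your exact file):

```apache
RewriteEngine On
# Serve files and directories that physically exist as-is;
# everything else is routed to index.php with the full path as "id"
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?id=$1 [L,QSA]
```

Because of the `!-f` condition, a physical sitemap.xml would be delivered before your script ever runs, which is why the file must not exist on disk.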
Hope that this helps you a bit.