General question about GS SEO: blogs, posts, pages, robots and sitemaps
#1
I have a site set up with GS to support one of my other professions. It has four pages: Home, Services, News and Contact, with the News page acting as a kind of blog via the News Manager plugin. You can see it here.

When I check what Google has indexed by putting site:www.domain.com in the search box, I see it has indexed the four pages, and GS has put those four pages in sitemap.xml too. But only the last ten posts show on the News page (enough, I think, for slow connections), while I have thirty quality posts saved in NM. All thirty posts could and should be generating pageviews and backlinks, along with new contacts and enquiries. Really I should have thirty-four pages in my sitemap and thirty-four pages being crawled and indexed.
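
To illustrate what I mean, a complete sitemap.xml would need one entry per post on top of the four page entries, roughly like this (made-up URLs, since the real permalinks depend on the NM URL settings):

  <url>
    <loc>http://www.domain.com/news/post/example-post-one/</loc>
  </url>
  <url>
    <loc>http://www.domain.com/news/post/example-post-two/</loc>
  </url>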

So how do I manage this? Does another blog plugin do things differently? Or should I really be using Google Blogger and putting links and a feed back to my GS site? I am pretty sure old Blogger posts still attract search engine hits. Or is that just going to look like duplicate content and count against me? Or is it simply a matter of getting decent sitemaps, XML and HTML, on my GS site? Or putting all thirty posts in the nav menu manually, with categories and fly-out menus? That sounds like a lot of work, and maybe News Manager isn't what I need any more. In another year I might have sixty different texts/readings/pages and might need a more serious system of tags, searches and indexes. How would I do that?
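
If the answer is "just generate a proper sitemap yourself", here is a rough sketch of what I have in mind: a little Python script that lists the NM post files and prints sitemap entries for the pages plus every post. I am assuming posts are stored as one XML file each under data/posts/ and that the filename minus .xml is the post slug; please correct me if NM stores them differently. The domain and permalink pattern are placeholders.

#!/usr/bin/env python3
# Rough sketch: print a sitemap covering the four GS pages plus every NM post.
# Assumptions (please correct me): posts live as individual XML files in
# data/posts/, the filename minus ".xml" is the slug, and posts are served
# at /news/post/<slug>/ . The domain is a placeholder.
import glob
import os
from xml.sax.saxutils import escape

SITE = "http://www.domain.com"
PAGES = ["", "services/", "news/", "contact/"]   # the four GS pages
POSTS_DIR = "data/posts"                         # assumed NM data folder
POST_URL = SITE + "/news/post/{slug}/"           # assumed permalink pattern

def all_urls():
    # Page URLs first, then one URL per saved post.
    for page in PAGES:
        yield SITE + "/" + page
    for path in sorted(glob.glob(os.path.join(POSTS_DIR, "*.xml"))):
        slug = os.path.splitext(os.path.basename(path))[0]
        yield POST_URL.format(slug=slug)

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
for url in all_urls():
    print("  <url><loc>%s</loc></url>" % escape(url))
print('</urlset>')

Run from the GS root and redirect the output to sitemap.xml; but if a plugin already does this properly for NM posts I would rather use that.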

Any thoughts?


Messages In This Thread
General question about GS SEO Blogs Posts Pages Robots and Sitemaps - by Timbow - 2013-04-16, 23:15:48


