Use {$extra} to avoid duplicate in News and CGBlog?

The place to talk about things that are related to CMS Made Simple, but don't fit anywhere else.
pwg
Forum Members
Posts: 191
Joined: Tue Aug 01, 2006 1:48 am

Use {$extra} to avoid duplicate in News and CGBlog?

Post by pwg »

Hi,

I have a client running a Christmas campaign, predominately on social media but with news posts and some duplicate copy into CGBlog (which I use as their recipe section). I would prefer the recipe content rank.

To avoid duplicate content penalties, I assign the Extra field to a variable in the News detail template, and added


{if isset($canonical)}
  <link rel='canonical' href='{if !empty($extra)}{$extra}{else}{$canonical}{/if}'/>
{elseif isset($content_obj)}
  <link rel='canonical' href='{$content_obj->GetURL()}'/>
{/if}
to my head content, and then simply place the "correct" recipe URL into the Extra field of the News article. I hide the field with CSS.
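For reference, the assignment step described above might look something like this in the News detail template. This is a sketch only: it assumes the News entry object exposes the Extra field as $entry->extra, and that the detail template is processed before the head section of the page template.

```smarty
{* Sketch: capture the Extra field into a variable instead of
   printing it, so the head template can test {$extra} *}
{assign var='extra' value=$entry->extra}
```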

Seems to work, but I'm wondering if this is the best way to do this?

Thanks for any advice. My client has taken a one day Google course - and a little knowledge is a very dangerous thing!

Cheers,

Paul
velden
Dev Team Member
Posts: 3497
Joined: Mon Nov 28, 2011 9:29 am

Re: Use {$extra} to avoid duplicate in News and CGBlog?

Post by velden »

I'm no SEO expert, but does this mean you set a canonical URL that points to a page (content) which is actually NOT the current page?

I bet that would be considered bad behavior.
"I hide the field with css."
Why? It would be better simply not to print the field, I'd guess.
Or are you still using the template's default foreach loop? That shouldn't be necessary.
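A sketch of what velden suggests: print the fields explicitly and skip the Extra field, so no CSS hiding is needed. The loop variables ($entry->fields, $field->name, $field->displayvalue) follow the default News detail template, but the exact names may differ between versions, so treat them as assumptions.

```smarty
{* Print each field except 'extra', which is used only for the
   canonical link in the page head and should not be rendered *}
{foreach from=$entry->fields item='field'}
  {if $field->name != 'extra'}
    <div class="NewsDetailField">{$field->displayvalue}</div>
  {/if}
{/foreach}
```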
pwg
Forum Members
Posts: 191
Joined: Tue Aug 01, 2006 1:48 am

Re: Use {$extra} to avoid duplicate in News and CGBlog?

Post by pwg »

Thanks velden, it does (meaning I'm pointing page A at page B).

I followed ideas from here: http://moz.com/learn/seo/duplicate-content.

I can easily add the page to robots.txt - perhaps that would be better?

I was using the default foreach, but have removed it now - thanks.

Obviously, I'm really no SEO expert!

Thanks,

Paul
velden
Dev Team Member
Posts: 3497
Joined: Mon Nov 28, 2011 9:29 am

Re: Use {$extra} to avoid duplicate in News and CGBlog?

Post by velden »

Well maybe it's fine to use that method.
JohnnyB
Dev Team Member
Posts: 731
Joined: Tue Nov 21, 2006 5:05 pm

Re: Use {$extra} to avoid duplicate in News and CGBlog?

Post by JohnnyB »

{if isset($canonical)}<link rel='canonical' href='{if ! empty($extra)}{$extra}{else}{$canonical}{/if}'/>{elseif isset($content_obj)}<link rel='canonical' href='{$content_obj->GetURL()}'/>{/if}
I think that is a good way to do it. The main thing is to have only one canonical link for each article, regardless of how many links actually open the article. So, if your method is doing that, then it is good and creative.

The only drawback is having to manually add the URL into your $extra field :(
"The art of life lies in a constant readjustment to our surroundings." -Okakura Kakuzo

--
LinkedIn profile
--
I only speak/write in English so I may not translate well on International posts.
--
pwg
Forum Members
Posts: 191
Joined: Tue Aug 01, 2006 1:48 am

Re: Use {$extra} to avoid duplicate in News and CGBlog?

Post by pwg »

Thanks JohnnyB, my concern is that one of the articles comes from the News module and the other from CGBlog.
They just both have virtually the same content. Perhaps not logical or ideal, but how the client wants it.

I'm still wondering if I should just add the "unwanted" article to robots.txt?

cheers,

Paul
JohnnyB
Dev Team Member
Posts: 731
Joined: Tue Nov 21, 2006 5:05 pm

Re: Use {$extra} to avoid duplicate in News and CGBlog?

Post by JohnnyB »

They just both have virtually the same content. Perhaps not logical or ideal, but how the client wants it.

I'm still wondering if I should just add the "unwanted" article to robots.txt?
You can add the path of the unwanted entries to robots.txt, using a wildcard in the path. And/or set a noindex meta tag in the detail page's metadata, and Google et al. will not index those. That might be easier than setting a canonical for each entry, but those pages won't be indexed at all, whereas your original solution will at least get them indexed.
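A sketch of the robots.txt option described above. The /news/ path is a placeholder; adjust it to the site's actual pretty-URL scheme for News detail pages.

```text
# robots.txt: stop crawlers fetching the duplicate News detail pages
User-agent: *
Disallow: /news/*
```

The noindex alternative would be a tag like <meta name="robots" content="noindex,follow" /> placed in the detail page's metadata. Note that robots.txt only blocks crawling, while the meta tag explicitly asks search engines not to index the page.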
pwg
Forum Members
Posts: 191
Joined: Tue Aug 01, 2006 1:48 am

Re: Use {$extra} to avoid duplicate in News and CGBlog?

Post by pwg »

Thanks JohnnyB, appreciate the help/advice.

cheers,

Paul
