Google Penguin looks mostly at your link source, says Google
Another nugget of information learned from the A conversation with Google’s Gary Illyes (part 1) podcast at Marketing Land, our sister site, was that Penguin is dubbed a “web spam” algorithm but it indeed focuses mostly on “link spam.” Google has continually told webmasters that this is a web spam algorithm, yet every webmaster and SEO focuses mostly on links. Google’s Gary Illyes said their focus is right: they should be mostly concerned with links when tackling Penguin issues.
Gary Illyes made a point to clarify that it isn’t just the link but rather the “source site” the link is coming from. Google said Penguin is based on “the source site, not on the target site.” You want your links to come from quality sources as opposed to low-quality sources.
One example Gary revealed was a negative SEO case submitted to him in which, he said, the majority of the links were on “empty profile pages, forum profile pages.” When he looked at those links, the new Penguin algorithm was already “discounting” them, devaluing those links.
“The good thing is that it is discounting the links, basically ignoring the links instead of demoting,” Gary Illyes added.
Here is the audio snippet:
Here is the transcript:
Barry Schwartz: You also talked about web spam versus link spam and Penguin. I know John Mueller specifically called it out again, in the original Penguin blog post that you had posted, that you said this is specifically a web spam algorithm. But every SEO that I know focuses just on link spam regarding Penguin. And I know when you initially started talking about this on our podcast just now, you said it’s mostly around really really bad links. Is that accurate to say, when you talk about Penguin, typically it’s around really really bad links and not other types of web spam?
Gary Illyes: It looks at… It’s not just links. It looks at a bunch of different things related to the source site. Links is just the most visible thing and the one that we decided to talk most about, because we already talked about links in general.
But it looks at different things on the source site, not on the target site, and then makes decisions based on those special signals.
I don’t actually want to reveal more of those spam signals because I think they would be pretty, I wouldn’t say easy to spam, but they would be easy to mess with. And I really don’t want that.
But there are quite a few hints in the original, the old Penguin article.
Barry Schwartz: Can you mention one of those hints that is in the article?
Gary Illyes: I would rather not. I know that you can make pretty good assumptions. So I would just let you make assumptions.
Danny Sullivan: If you were making assumptions. How would you make those assumptions?
Gary Illyes: I try not to make assumptions. I try to make decisions based on data.
Barry Schwartz: Should we be focusing on the link spam aspect of it for Penguin? Obviously focus on all the “make best quality sites” yada yada yada, but we talk about Penguin as reporters and we’re telling people that SEOs are, like, Penguin specialists or something like that, that they only focus on link spam. Is that wrong? I mean, should they?
Gary Illyes: I think that’s the main thing that they should focus on.
See where it is coming from and then make a decision based on the source site whether they want that link or not.
Well, for example, I was looking at a negative SEO case just yesterday or two days ago. And basically the content owner placed hundreds of links on empty profile pages, forum profile pages. Those links, with the new Penguin, were discounted. But if you looked at the page, it was pretty obvious that the links were placed there for a very specific reason, and that’s to game the ranking algorithms. And not just Google’s but any other ranking algorithm that uses links. If you look at a page, you can make a pretty easy decision on whether to disavow or remove that link or not. And that’s what Penguin is doing. It’s looking at signals on the source page. Basically what kind of page it is, what could be the purpose of that link, and then it makes a decision based on that whether to discount those things or not.
The good thing is that it is discounting the links, basically ignoring the links instead of demoting.
So in general, unless people are overdoing it, it’s unlikely that they will actually feel any sort of effect by placing those links. But again, if they are overdoing it, then the manual actions team might take a deeper look.
You can listen to part one of the interview at Marketing Land.