
Matt Cutts Finally Announces Link Disavow Tool For Google Webmaster Tools


After months of anticipation, Google’s Matt Cutts, at PubCon in Las Vegas today, finally announced a new tool in Webmaster Tools to disavow links.
Cutts made comments at SMX Advanced back in July, indicating that a tool would be on the way, and it is now here.
In text on the tool itself, Google says, “If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site.”
Here is Cutts talking about it in a new Webmaster Help video:

“You might have been doing blog spam, comment spam, forum spam, guestbook spam…maybe you paid somebody to write some low quality articles and syndicate those all over the place with some very keyword rich anchor text, and maybe Google sent you a message that says, ‘We’ve seen unnatural links to your site or we’ve taken targeted action on some of the unnatural links to your site,’ and so as a result, you want to clean up those backlinks,” Cutts says in the video.
First and foremost, he says, they recommend getting those links actually removed from the web. Of course, that’s easier said than done.
Google says in a help center article:
PageRank is Google’s opinion of the importance of a page based on the incoming links from other sites. (PageRank is an important signal, but it’s one of more than 200 that we use to determine relevancy.) In general, a link from a site is regarded as a vote for the quality of your site.
Google works very hard to make sure that actions on third-party sites do not negatively affect a website. In some circumstances, incoming links can affect Google’s opinion of a page or site. For example, you or a search engine optimizer (SEO) you’ve hired may have built bad links to your site via paid links or other link schemes that violate our quality guidelines. First and foremost, we recommend that you remove as many spammy or low-quality links from the web as possible.
If you’ve done as much work as you can to remove spammy or low-quality links from the web, and are unable to make further progress on getting the links taken down, you can disavow the remaining links. In other words, you can ask Google not to take certain links into account when assessing your site.
Update: Google has now put out an official blog post about the tool. In that, Webmaster Trends Analyst Jonathan Simon writes:
If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.
If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page. When you arrive, you’ll first select your site.
According to a liveblogged account of Cutts’ speech, he says not to use the tool unless you’re sure you need it. He mentioned that, going forward, Google will be sending out more messages with examples of links it distrusts. He also says not to disavow links from your own site.
Regarding those link messages, Cutts says in the video that these are only examples of links, and not a comprehensive list.
The tool consists of a .txt file (disavow.txt) with one URL per line, telling Google which links to ignore. You can also use it to block an entire domain with an entry in the format: domain:www.example.com.
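Based on that description, a disavow file is just plain text with one entry per line; Google’s documentation also allows comment lines beginning with “#”. The example.com addresses below are placeholders, not real sites:

```text
# Links we contacted the site owners about but could not get removed
http://spammer.example.com/low-quality-article.html
http://spammer.example.com/another-page.html

# Disavow every link from this entire domain
domain:www.example.org
```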
Cutts apparently suggests that most sites not use the tool, and that it is still in the early stages. Given that link juice is a significant ranking signal for Google, it’s easy to see why Google wouldn’t want the tool to be over-used.
It can reportedly take weeks for Google to actually disavow links. In a Q&A session, according to the liveblog from Search Engine Roundtable, Cutts said you should wait 2-3 days after submitting a disavow file before sending a reconsideration request. When asked if it hurts your site when someone disavows links from it, he reportedly said that it typically does not, as they look at your site as a whole.
Danny Sullivan blogs that “Google reserves the right not to use the submissions if it feels there’s a reason not to trust them.”
Users will be able to download the file they submitted and resubmit it later with any changes. According to Sullivan’s account, Cutts said the tool is like using the “nofollow” attribute in that it allows sites to link to others without passing PageRank.
That’s good to know.
A lot of SEOs have been waiting for Google to launch something like this for a long time. Perhaps it will cut down on all of the trouble webmasters have been going through trying to get other sites to remove links. At the same time, we also have to wonder how much overreaction there will be from webmasters who end up telling Google to ignore too many links, and shooting themselves in the foot. This will be a different era, to say the least.
Just be warned. Google’s official word of caution is: “If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.”
The information Google uses from the tool will be incorporated into its index as it recrawls the web and reprocesses the pages it sees.
Google currently supports one disavow file per site. That file is shared among site owners in Webmaster Tools. The file size limit is 2MB.
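Putting the reported constraints together (one URL or “domain:” entry per line, and a 2MB file size limit), here is a minimal sketch of a checker for a disavow file before you upload it. This is purely an illustrative helper, not an official Google tool, and the exact rules it enforces are assumptions drawn from the description above:

```python
# Illustrative validator for a disavow.txt file: one URL or "domain:"
# entry per line, "#" comment lines allowed, total size under the
# 2MB limit mentioned for Webmaster Tools.

MAX_BYTES = 2 * 1024 * 1024  # reported 2MB limit per file

def validate_disavow(text):
    """Return a list of (line_number, problem) tuples; empty means OK."""
    problems = []
    if len(text.encode("utf-8")) > MAX_BYTES:
        problems.append((0, "file exceeds 2MB limit"))
    for i, line in enumerate(text.splitlines(), start=1):
        entry = line.strip()
        if not entry or entry.startswith("#"):
            continue  # blank lines and comments are fine
        if entry.startswith("domain:"):
            if not entry[len("domain:"):]:
                problems.append((i, "empty domain entry"))
        elif not entry.startswith(("http://", "https://")):
            problems.append((i, "not a URL or domain: entry"))
    return problems

sample = """# links we could not get removed
http://spam.example.com/page1.html
domain:www.example.org
"""
print(validate_disavow(sample))  # prints [] for a valid file
```

Running a check like this locally costs nothing, whereas an upload that Google rejects or misreads could delay an already weeks-long reprocessing cycle.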

Google Launches New Page Layout Update (Yes, ANOTHER Update)


Google is on a roll with these updates. I think webmasters are starting to understand what Google’s Matt Cutts meant when he said a while back that updates would start getting “jarring and jolting”. It seems that, rather than one major update, we’re getting a bunch of updates in a short amount of time. This past Friday, Google launched its latest Penguin refresh. A week before that, it was the EMD update and a new Panda update.
Tuesday, Cutts tweeted about a Page Layout update:


The Page Layout update was first announced early this year, months before we ever saw the first Penguin update. It’s sometimes referred to as the “above the fold” update. It was designed to target pages that lack content above the fold. At the time, Cutts wrote in a blog post:
As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.
We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.
It looks like Christmas has come early for webmasters this year. Although, on that note, this could be a sign that Google is getting all of this stuff out of the way before the holiday season, so they don’t mess too much with your rankings during this crucial time of year for ecommerce. They’ve shown in the past that they’ve learned from the infamous Florida update.

Google Penguin Update Gets A New Data Refresh


Oh, that Google and its late Friday announcements. Sometimes it’s the big monthly (at least they used to be) “search quality highlights” lists, but Google was kind enough to release that on Thursday this past week. Still, Google’s Matt Cutts managed to sneak in a Penguin announcement on Friday. He tweeted:


He followed that up with:



Cutts has made comments in the past indicating that this update could be “jarring”. Are you seeing the effects? It’s been quite a week for Google updates. The Friday before this announcement, Cutts announced the EMD update, and later noted that there was also a Panda update rolling out. More on all of this here. I’m sure we’ll be discussing the Penguin update more in the coming week.

Matt Cutts Just Announced A Google Algorithm Change


Matt Cutts has announced a new Google algorithm change, which he says will reduce low-quality “exact-match” domains in search results. It sounds like an extension of the last change he tweeted about, which was aimed at improving domain diversity. Here’s the new tweet:

Update: Cutts tweeted a follow-up:

Probably good of him to clear that up right away.
Google is about due to publish its big list of algorithm changes for the months of August and September. When that happens, it will be interesting to see how many entries are related to domains. It seems like there are typically visible themes in the lists. For example, in the June list, there were a lot of changes related to improving how Google deals with natural language.
Have you seen any effects from this update? Let us know with your comments.

Google Launched An Update This Week To Improve Domain Diversity


Google has launched an algorithm update that affects the diversity of search results. Matt Cutts, Google’s head of webspam and Distinguished Engineer, tweeted:

Matt Cutts (@mattcutts): “Just fyi, we rolled out a small algo change this week that improves the diversity of search results in terms of different domains returned.”

There have been complaints in recent weeks about Google showing search results pages with a lot of results from the same domain for a lot of queries. Presumably that will be better now, and users will get a more diverse set of results in more cases. Or maybe it’s just about spreading the love among more domains in general (and not just per page).

That’s as much as we know about the update for now, but it’ll be interesting to see if the change is noticeable on a day to day basis.

There has been talk from webmasters that there may have been a new Panda update this week. We’ve not heard from Google on that front, and it’s unclear at this point whether this could have been the change people were noticing.

Google’s big list of algorithm changes for the month of August is due out any time now, and when it’s released, we’ll get more insight into the direction Google is going and its core areas of focus in recent weeks. Stay tuned.

 
Internet Marketing Expert, SEO Latest Google Updates - Naveen Kumar © 2012 | Designed by Meingames and Bubble shooter