Confirmed: A Panda Refresh, Version #23


Update: Google has confirmed that this is a Panda refresh. That makes it version 23, and it impacted about 1.3% of English-language queries.

I should note that I am shocked they pushed out a refresh before the holiday season. Last year they promised they would not do a Panda update before the holidays. What a difference a year makes.

A week ago, we reported on some very strong signs of a Google update last Thursday. Google told us there was no update, not Panda and not anything else. Webmasters feel Google is either lying or clueless.

That being said, there are now new reports in the same WebmasterWorld thread of an update hitting the Google search results a few hours ago.

In the WebmasterWorld thread, here are some of the posts:
  • “Looked like something got rolled back to me. My positions have all reverted to where they were last week.”
  • “There has been some shuffling overnight (in which my main key term page has gone from page 2 to page 4).”

Now it is really early, and the reporting tools, including MozCast, SERPmetrics and SERPS.com, have yet to report as of this morning. Since this seems to have been rolling out early this morning, the tools may not show anything until tomorrow.

Google has confirmed a Panda refresh impacting 1.3% of queries!

Matt Cutts Finally Announces Link Disavow Tool For Google Webmaster Tools


After months of anticipation, Google’s Matt Cutts, at PubCon in Las Vegas today, finally announced a new tool in Webmaster Tools to disavow links.
Cutts made comments at SMX Advanced back in July, indicating that a tool would be on the way, and it is now here.
In text on the tool itself, Google says, “If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site.”
Here is Cutts talking about it in a new Webmaster Help video:

“You might have been doing blog spam, comment spam, forum spam, guestbook spam…maybe you paid somebody to write some low quality articles and syndicate those all over the place with some very keyword rich anchor text, and maybe Google sent you a message that says, ‘We’ve seen unnatural links to your site or we’ve taken targeted action on some of the unnatural links to your site,’ and so as a result, you want to clean up those backlinks,” Cutts says in the video.
First and foremost, he says, they recommend getting those links actually removed from the web. Of course, that’s easier said than done.
Google says in a help center article:
PageRank is Google’s opinion of the importance of a page based on the incoming links from other sites. (PageRank is an important signal, but it’s one of more than 200 that we use to determine relevancy.) In general, a link from a site is regarded as a vote for the quality of your site.
Google works very hard to make sure that actions on third-party sites do not negatively affect a website. In some circumstances, incoming links can affect Google’s opinion of a page or site. For example, you or a search engine optimizer (SEO) you’ve hired may have built bad links to your site via paid links or other link schemes that violate our quality guidelines. First and foremost, we recommend that you remove as many spammy or low-quality links from the web as possible.
If you’ve done as much work as you can to remove spammy or low-quality links from the web, and are unable to make further progress on getting the links taken down, you can disavow the remaining links. In other words, you can ask Google not to take certain links into account when assessing your site.
Update: Google has now put out an official blog post about the tool. In that, Webmaster Trends Analyst Jonathan Simon writes:
If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.
If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page. When you arrive, you’ll first select your site.
According to a liveblogged account of Cutts’ speech, he says not to use the tool unless you’re sure you need to use it. He mentioned that Google, going forward, will be sending out more messages about examples of links Google is distrusting. He also says not to disavow links from your own site.
Regarding those link messages, Cutts says in the video that these are only examples of links, and not a comprehensive list.
The tool works with a plain .txt file (disavow.txt), containing one URL per line, which tells Google to ignore links from those pages. You can also use it to block a whole domain by using a format like: domain:www.example.com.
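To make that concrete, here is a minimal sketch of what a disavow file could look like. The URLs below are just placeholders, not real sites, and lines beginning with # are comments for your own notes that Google ignores:

# Asked for removal of these links on 10/1/2012 but got no response
http://www.example.org/spammy-directory/page1.html
http://www.example.org/spammy-directory/page2.html
# Disavow every link from this domain
domain:www.example.com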
Cutts apparently suggests that most sites not use the tool, and that it is still in the early stages. Given that link juice is a significant ranking signal for Google, it’s easy to see why Google wouldn’t want the tool to be over-used.
It can reportedly take weeks for Google to actually disavow links. In a Q/A session, according to the liveblog from Search Engine Roundtable, Cutts said you should wait 2-3 days before sending a reconsideration request after you submit a disavow file. When asked if it hurts your site when someone disavows links from it, he reportedly said that it typically does not, as they look at your site as a whole.
Danny Sullivan blogs that “Google reserves the right not to use the submissions if it feels there’s a reason not to trust them.”
Users will be able to download the file they submitted and submit it again later with any changes. According to Sullivan’s account, Cutts said the tool is like using the “nofollow” attribute in that it allows sites to link to others without passing PageRank.
That’s good to know.
A lot of SEOs have been waiting for Google to launch something like this for a long time. Perhaps it will cut down on all of the trouble webmasters have been going through trying to get other sites to remove links. At the same time, we also have to wonder how much overreaction there will be from webmasters who end up telling Google to ignore too many links, and shooting themselves in the foot. This will be a different era, to say the least.
Just be warned. Google’s official word of caution is: “If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.”
The information Google uses from the tool will be incorporated into its index as it recrawls the web and reprocesses the pages it sees.
Google currently supports one disavow file per site. That file is shared among site owners in Webmaster Tools. The file size limit is 2MB.

Google Launches New Page Layout Update (Yes, ANOTHER Update)


Google is on a roll with these updates. I think webmasters are starting to understand what Google’s Matt Cutts meant when he said a while back that updates would start getting “jarring and jolting”. It seems that, rather than one major update, we’re getting a bunch of updates in a short amount of time. This past Friday, Google launched its latest Penguin refresh. A week before that, it was the EMD update and a new Panda update.
Tuesday, Cutts tweeted about a Page Layout update:


The Page Layout update was first announced early this year, months before we ever saw the first Penguin update. It’s sometimes referred to as the “above the fold” update. It was designed to target pages that lack content above the fold. At the time, Cutts wrote in a blog post:
As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.
We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.
It looks like Christmas has come early for webmasters this year. Although, on that note, this could be a sign that Google is getting all of this stuff out of the way before the holiday season, so they don’t mess too much with your rankings during this crucial time of year for ecommerce. They’ve shown in the past that they’ve learned from the infamous Florida update.

Google Penguin Update Gets A New Data Refresh


Oh, that Google and their late-week announcements. Sometimes it’s the big monthly (at least they used to be monthly) “search quality highlights” lists, but they were kind enough to release that on Thursday this past week. Still, Google’s Matt Cutts managed to sneak in a Penguin announcement on Friday. He tweeted:


He followed that up with:



Cutts has made comments in the past indicating that this update could be “jarring”. Are you seeing the effects? It’s been quite a week for Google updates. The Friday before this announcement, Cutts announced the EMD update, and later noted that there was also a Panda update rolling out. More on all of this here. I’m sure we’ll be discussing the Penguin update more in the coming week.

Google: By The Way, A Panda Update Is Rolling Out Alongside The EMD Update


Last Friday, Google announced the EMD update. It was billed as a small and minor update, but the effects seemed to be fairly large, with many webmasters claiming to have been hit. Google’s Matt Cutts made it a point to say that the algorithm change was unrelated to both Panda and Penguin.
He then said it was not the only update that was rolling out during that timeframe, noting that Google makes changes every day (over 500 a year). He didn’t happen to mention that there was a new Panda update, however. Finally, he has dropped the news that there was indeed a Panda update going on at the same time as the EMD update (and it’s still rolling out).
Were you impacted by one of these updates? Are you able to discern which one it was? Let us know in the comments.
Search Engine Land reports that Google released a Panda algorithm update (not a data refresh, but an actual update) on Thursday, and that it impacts 2.4% of English search queries (and is still rolling out). That’s significantly larger than the 0.6% of English-US queries Cutts said the EMD update affected. So, it seems that the majority of those claiming to be hit by the EMD update were likely hit by Panda (which would explain those claiming to be hit, that didn’t have exact match domains).
Here’s the exact statement from Cutts that the publication is sharing: “Google began rolling out a new update of Panda on Thursday, 9/27. This is actually a Panda algorithm update, not just a data update. A lot of the most-visible differences went live Thursday 9/27, but the full rollout is baking into our index and that process will continue for another 3-4 days or so. This update affects about 2.4% of English queries to a degree that a regular user might notice, with a smaller impact in other languages (0.5% in French and Spanish, for example).”
Couldn’t he have just said that in the first place? Google had to know the confusion this would cause. Since the original Panda update, Google has made more of an effort to be transparent about algorithm changes, and it certainly has been. It seems, however, like delayed transparency is becoming the trend recently.
For months, Google was releasing monthly lists of updates that had been made the prior month. The last time, they left people waiting before finally posting a giant list for two months’ worth of changes. It seems that Google is doing this again, as we have yet to see lists for August or September (assuming Google is about to release these lists).
Either way, it appears the Panda continues to wreak havoc on webmasters. Wait until they get a load of the next Penguin.
For those sites that were hit, obviously if there is not an exact match domain involved, that makes the problem a little easier to figure out, at least in terms of which update the site was actually hit by. It seems unlikely that the EMD update would have done much to impact you if your site does not use an EMD. Which leaves Panda (and of course, any other updates that Google hasn’t told us about – they do make changes every day, and often more than one in a day).
While Cutts said that the EMD update is unrelated to Panda, that is not necessarily the case, depending on how you view the comment. Algorithmically speaking, I presume Cutts means the two have nothing to do with each other. However, in concept, the two are very similar in that they go after low quality. So, doesn’t it stand to reason that if you improve the quality of your content, you could actually recover from either update? That assumes the EMD update is one that can be recovered from. Let’s put it this way: if it’s possible to recover from the EMD update (which it most likely is), improving the quality of your site and content should be the main objective.
This just happens to be the same objective for recovering from Panda. Of course, quality is subjective, and Google has its own view of what this entails. Luckily for webmasters, Google has essentially laid out exactly what it is looking for from content, specifically with regard to the Panda update.
Google has pretty much given webmasters the rules of the road to Panda recovery, even if they’re not official rules. You’ve probably seen the list before, but if you were never hit by the Panda update until now, maybe you haven’t. Either way, here are the questions Google listed last year as “questions that one could use to assess the quality of a page or an article”:
  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?
Of course, Google uses over 200 signals in all, but that should get you started on thinking about your site’s content.
And with regards to the EMD update, remember, Google is targeting “low quality” EMDs. Not simply EMDs in general.
We’ve provided tons of coverage of the Panda update since Google first launched it. To learn more about it, feel free to peruse the Panda section of WebProNews.
Do you think Google has improved its search results with this algorithm combo? Is Google being transparent enough about algorithm updates for your taste? Let us know what you think in the comments.

Matt Cutts Just Announced A Google Algorithm Change


Google’s Matt Cutts just announced a new algorithm change, which he says will reduce low-quality “exact-match” domains in search results. It sounds like an extension of the last change he tweeted about, which was aimed at improving domain diversity. Here’s the new tweet:

Update: Cutts tweeted a follow-up:

Probably good of him to clear that up right away.
Google is about due to publish its big list of algorithm changes for the months of August and September. When that happens, it will be interesting to see how many entries are related to domains. It seems like there are typically visible themes in the lists. For example, in the June list, there were a lot of changes related to improving how Google deals with natural language.
Have you seen any effects from this update? Let us know with your comments.

Google News Gets A New Ranking Signal, And It’s A Keywords Meta Tag


Google announced the news_keywords metatag for publishers in Google News, to help Google better identify and understand content related to what’s in the news.
Do you think this is a good direction for Google News? Let us know what you think.
Here’s what it looks like:
<meta name="news_keywords" content="World Cup, Brazil 2014, Spain vs Netherlands, soccer, football">
If you use it, use commas to separate phrases. You can add up to ten phrases per article, and each keyword is given equal value.
The company says it’s a way to empower writers to express stories freely, while helping Google News properly understand and classify content. In a blog post, Google News product manager Rudy Galfi explains the thought process behind the feature:
The day after the historic 1929 stock market crash, Variety bannered their front page with these words: “WALL ST. LAYS AN EGG.” It’s a great headline: pithy, catchy, and expressive of the substance of the story as well as the scale of its consequences. It’s also worth noting that Variety’s editors had a full day to write the headline—millions of readers weren’t trying to search for the story within seconds of hearing about it.
The Web has transformed both how news organizations report information and the way users find it. Imagine if “WALL ST. LAYS AN EGG” were used as a headline today by an online news site. Since the headline is a sequence of text that’s only readily understandable by a human, most machine algorithms would probably attach some sort of biological association to it. In turn, this would make it difficult for millions of curious users who are using Google.com or Google News to find the best article about the stock market crash they just heard about.
With the news_keywords metatag, publishers can specify keywords that apply to news articles, much like the classic keywords metatag.
The whole thing is pretty interesting, considering that Google has downplayed the regular keywords metatag. In fact, earlier this year, in a Webmaster Help video, Matt Cutts said, “You shouldn’t spend time on the meta keywords tag. We don’t use it. I’m not aware of any major search engine that uses it these days.”
Of course, this is a different tag, and it’s specifically news-related, though news results often appear in regular Google results. Cutts did say in a tweet:

Google is careful to note that the tag will be only “one signal among many” that its algorithms use to determine ranking.
“The news_keywords metatag is intended as a tool — but high-quality reporting and interesting news content remain the strongest ways to put your newsroom’s work in front of Google News users,” says Galfi.
Keep in mind, Google still frowns upon keyword stuffing (unless that’s going away in an upcoming version of its Webmaster Guidelines, which is highly doubtful).
In case you need a refresher, here are Google’s quality guidelines for News:
News content. Sites included in Google News should offer timely reporting on matters that are important or interesting to our audience. We generally do not include how-to articles, advice columns, job postings, or strictly informational content such as weather forecasts and stock data.
We mean it — stick to the news! Google News is not a marketing service. We don’t want to send users to sites created primarily for promoting a product or organization.
Unique articles. Original reporting and honest attribution are longstanding journalistic values. (If your site publishes aggregated content, you will need to separate it from your original work, or restrict our access to those aggregated articles via your robots.txt file.)
Authority. Write what you know! The best news sites exhibit clear authority and expertise.
Accountability. Users tell us they value news sites with author biographies and clearly accessible contact information, such as physical and email addresses, and phone numbers.
User-friendly. Sites should load quickly and use URL redirects rarely. Clearly written articles with correct spelling and grammar also make for a much better user experience. Keep in mind that we can only include sites that follow the Webmaster Guidelines.
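On the “Unique articles” point above, one way to restrict Google’s access to aggregated content is with a couple of lines in robots.txt. Here’s a minimal sketch; the /aggregated/ directory is a hypothetical path standing in for wherever a site keeps its syndicated articles:

# Keep the Google News crawler out of the aggregated-content directory
User-agent: Googlebot-News
Disallow: /aggregated/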

Google Launched An Update This Week To Improve Domain Diversity


Google launched an algorithm update that affects the diversity of search results. Google’s head of webspam and Distinguished Engineer, Matt Cutts, tweeted:

Matt Cutts (@mattcutts): “Just fyi, we rolled out a small algo change this week that improves the diversity of search results in terms of different domains returned.”

There have been complaints in recent weeks about Google showing search results pages with a lot of results from the same domain for a lot of queries. Presumably that will be better now, and users will get a more diverse set of results in more cases. Or maybe it’s just about spreading the love among more domains in general (and not just per page).

That’s as much as we know about the update for now, but it’ll be interesting to see if the change is noticeable on a day to day basis.

There has been talk from webmasters that there may have been a new Panda update this week. We’ve not heard from Google on that front, and it’s unclear at this point whether this could have been the change people were noticing.

Google’s big list of algorithm changes for the month of August is due out any time now, and when it’s released, we’ll get more insight into the direction Google is going in, and its core areas of focus in recent weeks. Stay tuned.

Fear Of Google Ironically Has People Considering Making Natural Links Unnatural


We recently published an article called “Links Are The Web’s Building Blocks, And Fear Of Google Has Them Crumbling”. This was about the panic Google has caused among webmasters with its messages about links. It’s a panic that has led to many webmasters requesting to have links removed from sites that they would otherwise find valuable, if not for fear that Google will not like them and hurt their rankings.

Is all of this fear over Google an overreaction, or is it justified? Let us know what you think.

I noticed a post in WebmasterWorld that expresses this point all too perfectly. The title of the post says it all: “New link to my site worries me — but it’s a good link!” Senior member crobb305 writes:

Got an unsolicited citation from a media source but they used anchor text that I have been penalized on. FUD! Should I ask them to change it? By doing that I make a natural link unnatural, and Googlebot will detect that change (obvious tinkering). Nevertheless, I do have an OOP and received the infamous link warnings about 5 months ago.

I hate it that we have to live with this type of fear.

This person has been a member of WebmasterWorld since 2002, so they’ve clearly been in this world for quite a while. Yet here they are concerned that a completely natural link might draw negative attention from Google. The person is even wondering if they should go out of their way to make the link unnatural to please Google. How’s that for irony? Sadly, it’s highly likely that plenty of other webmasters are thinking similar thoughts.

As shared in the article mentioned at the beginning, there is plenty of overreaction from webmasters out there, and I would say that Google would rather see the link occur in its natural form, but this is the kind of fear people are dealing with to please Google and maintain some form of visibility in search results (which is getting harder and harder for other reasons entirely). Should people have to be this worried about links (the building blocks of the web)?

It probably doesn’t help that Google has reportedly indicated that forthcoming algorithm updates will be more “jarring.”

Another forum senior member later responded, “But seriously, some of the sites of mine that went down the Google drain were clean, ‘link building’ was not done, just attracted some real nice ones and yet the project died due to ‘penalties’. I went out of answers to this somewhere in the middle of 2011 and focus on cool stuff, HTML5, content (I think some tools can be considered good content) and ultimately ranking solid on Bing. Google does whatever Google wants to do.”

Likewise, Chris Lang from Gadget MVP tells me on Google+, “I never have worried about Google. I just do what seems natural. Never been slapped once…. At least not by Google.”

WebmasterWorld moderator goodroi tells the user, “One link from a quality, relevant website is not the problem. The hundreds of links with identical anchor text coming from blog spam, directory submission schemes and other short cuts are the problem.”

“I tend to focus more of my efforts on improving backlink profiles by adding quality links instead of focusing on deleting bad links,” goodroi adds. “Even if you delete every single bad link (and somehow are lucky not to accidentally delete a good link) you still need to build legitimate links. So if you start working on legitimate links you may end up getting enough good links that it naturally defuses the bad link issues.”

Unfortunately, many are seemingly still eager to kill significantly more links than they may really need to. On the flipside, even some publishers are growing leery of including guest content on their sites. This fear, apparently, is coming from the Penguin update.

Barry Schwartz at Search Engine Roundtable points to a post from Cre8asite Forums, where user EGOL writes:

Since Penguin, I am getting a flood of article offers. Most of this content is crap. Some of it is “average” quality (which I don’t publish). Some can be excellent, unique, highly desirable. So now I am deciding if I want to accept some of this content, knowing that I could be publishing links to sites that could have past, present or future manipulation.

I have a potential article that I really like and that would be very popular with my visitors. The author’s site ranks #1 in a difficult niche and they don’t have enough content on their site to hold that position from editorial links (IMO).

I have not seen any articles or discussion about the cautions that a publisher should be following in these days of post-penguin linking.

So, not only are people afraid to have links out there that they would find valuable, if not for fear of Google, but some are also afraid to publish quality content, for fear that it might somehow be connected to something Google will not like. Ironically, quality content is what Google wants from sites above all else.

Are webmasters worrying about Google too much, or are these simply rational concerns, with Google being such a dominant force on the web? Comments please...

Removal Requests Actually Down, Following Google Algorithm Change


On August 10, Google announced that it would be updating its algorithm the following week to include a new ranking signal for the number of “valid copyright removal notices” it receives for a given site.
“Sites with high numbers of removal notices may appear lower in our results,” said Google SVP, Engineering, Amit Singhal, at the time. “This ranking change should help users find legitimate, quality sources of content more easily—whether it’s a song previewed on NPR’s music website, a TV show on Hulu or new music streamed from Spotify.”
One might have expected the removal request floodgates to have been opened upon this news, but that does not appear to be the case. In fact, interestingly, it has been kind of the opposite, according to Google’s Transparency Report.
Barry Schwartz at Search Engine Roundtable points out that from August 13 to August 20, the number of URLs requested to be removed from Google search per week actually decreased, going from 1,496,220 to 1,427,369 (a drop of roughly 69,000 URLs, or about 4.6%). It’s only a slight decrease, but the fact that it decreased at all, following this news, is noteworthy.
(Graph: URLs requested to be removed from Google search per week)
August 20 is the latest date Google has data available for, so we’ll see what the following week looked like soon enough. As you can see from the graph, the number has been trending upward, and has jumped quite significantly over the course of this summer.
For the past month, Google says 5,680,830 URLs have been requested to be removed from 31,677 domains by 1,833 copyright owners and 1,372 reporting organizations. The top copyright owners in the past month have been Froytal Services, RIAA member companies, Microsoft, NBCUniversal and BPI. The top specified domains have been filestube.com, torrenthound.com, isohunt.com, downloads.nl and filesonicsearch.com.

 