Why is this site being deindexed?



From my limited look, based just on the info in the thread, I think you have .htaccess problems with multiple URLs. Could be a problem with a certain bamboo-eating yet cuddly animal.
Make SSL active only on the pages where someone is signing up, paying for something, etc.
Looks like you have multiple URLs with the same content.
That could be it, but if it were me I would PM Grindstone.
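
For what it's worth, the .htaccess fix being suggested usually looks something like this — a minimal sketch assuming Apache with mod_rewrite; example.com and the /signup and /checkout paths are placeholders, not taken from the site in the OP:

```apache
RewriteEngine On

# Collapse www and non-www onto one host so the same content
# doesn't get indexed under multiple URLs (301 = permanent redirect)
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

# Force HTTPS only on the signup/payment pages, per the SSL advice above
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} ^/(signup|checkout) [NC]
RewriteRule ^(.*)$ https://example.com%{REQUEST_URI} [R=301,L]
```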
 
Word. Thanks for the input, guys.
This site is not a client's; it was just brought to my attention and piqued my interest due to the strange nature of the situation.

It looks like most of the site is reindexed again. I personally think it was just a weird Google fluke, but I guess we will never know. Most things brought up here will help the owner either way.

Cheers
 
It looks like most of the site is reindexed again. I personally think it was just a weird Google fluke, but I guess we will never know. Most things brought up here will save the owner from becoming another algorithmic stat, since he could get nailed by Panda at any time in the future.

Cheers

Fixed it for you.
 
lol, don't be naive.

He's right though.

Ryan nailed it, that thing is Panda-fucked for sure. Welcome to the unnamed rollout 3+ weeks ago: strong actions against big sites with lots of dupe/low-quality content and over-optimized on-site metrics. Throw some D's (content) on that bitch.
 
He's right though.

Ryan nailed it, that thing is Panda-fucked for sure. Welcome to the unnamed rollout 3+ weeks ago: strong actions against big sites with lots of dupe/low-quality content and over-optimized on-site metrics. Throw some D's (content) on that bitch.

How do HomeAdvisor, Angie's List, and the lot get away with how they return search results? Surely they have craploads of useless pages?
 
^^^^ Gotta be that big-brand bias, no? It'd be interesting to see if any of them are directly mentioned in the review docs. Either way, over-optimizing and then de-optimizing is a painful but effective way to shake it if it's Panda. Takes about a month on huge sites (500k+ pages indexed), probably much quicker on smaller ones.
 
It may be a wee bit risky, but I am liking 'WP Indexer' from Sam (WPSnowball 2), though more for fun toy sites (suprcloakr, serpremacy and such going hand in hand).

Not explicitly BH on its own, but I might not use it on a domain I'm not OK with losing. But no one's silly enough to have domains like that, are they? :-) I hear the internet might be going away in 10 or 1 years! The whole internet! Going the way of Grindstone's avatar! Hurry up! Make and break a website!

UNAGI++++++++++++++++++++
 
^^^^ Gotta be that big-brand bias, no? It'd be interesting to see if any of them are directly mentioned in the review docs. Either way, over-optimizing and then de-optimizing is a painful but effective way to shake it if it's Panda. Takes about a month on huge sites (500k+ pages indexed), probably much quicker on smaller ones.
It's almost like you and I are dealing with that right now or something......
 
How does Homeadvisor, angieslist, and the lot get away with how they return search results? Surely they have crap loads of useless pages?

Useless content isn't the issue. Panda can assess a site's overall quality based on a page-by-page analysis of a lot of factors, but the only metric Google has for determining "usefulness" is bounce rate.

Useless pages aren't the issue. Duplicate content, content elements without semantic variational relationships, site speed and coding factors and the like are the big issues, as is bounce rate. The site in the OP pretty much violates everything.

We don't know the bounce rate of those brands... or, more importantly, of the individual pages. I don't buy into "big brand bias" outside of any Vince update argument. Ccarter, I believe, has a post in the Enlightened section on doing the things brands do.

Don't focus on a site's useless pages. You don't know what their useless pages are, and you cannot tell unless you have access to their analytics. Uselessness is not an argument while doing a competitive analysis, because you are injecting your own bias, thus skewing actual data.
 
Useless content isn't the issue. Panda can assess a site's overall quality based on a page-by-page analysis of a lot of factors, but the only metric Google has for determining "usefulness" is bounce rate.

Useless pages aren't the issue. Duplicate content, content elements without semantic variational relationships, site speed and coding factors and the like are the big issues, as is bounce rate. The site in the OP pretty much violates everything.

We don't know the bounce rate of those brands... or, more importantly, of the individual pages. I don't buy into "big brand bias" outside of any Vince update argument. Ccarter, I believe, has a post in the Enlightened section on doing the things brands do.

Don't focus on a site's useless pages. You don't know what their useless pages are, and you cannot tell unless you have access to their analytics. Uselessness is not an argument while doing a competitive analysis, because you are injecting your own bias, thus skewing actual data.

I get the gist of what you're saying, but in Angie's case they deliver results that may or may not be there, as the real results hide behind a login screen filled with articles like this: Best West Boylston, MA Siding Contractors | Angie's List

HomeAdvisor, formerly ServiceMagic, does similar things, delivering useless content when they don't have a listing for that location.

I would imagine delivering the same articles per category for every location in the USA would tend to be duplicate content, no? I mean, how many times do they show the same page with only the location different? Thousands of times, if not millions.
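
As a toy illustration of why "same page, only the location different" reads as duplicate content: a rough Jaccard-similarity check over word shingles. This is not Google's actual algorithm, just a sketch with made-up page text, but it shows how swapping one city name leaves two pages nearly identical:

```python
# Toy duplicate-content check: Jaccard similarity over 3-word shingles.
# NOT Google's algorithm -- just an illustration that swapping one
# city name barely changes a page's fingerprint.

def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "Find the best siding contractors in Boston rated by local homeowners"
page_b = "Find the best siding contractors in Denver rated by local homeowners"
print(jaccard(page_a, page_b))  # prints 0.5 -- half the shingles are shared
```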
 


They'll get away with SERP murder if they'd like to.
 
I get the gist of what you're saying, but in Angie's case they deliver results that may or may not be there, as the real results hide behind a login screen filled with articles like this: Best West Boylston, MA Siding Contractors | Angie's List

I would imagine delivering the same articles per category for every location in the USA would tend to be duplicate content, no? I mean, how many times do they show the same page with only the location different? Thousands of times, if not millions.

They serve that same LP for each location. I get that. But they did something really smart: they added unique reviews and a list of contractors from that location at the bottom of those pages. So each location has enough unique content to make each location page distinct. The articles themselves are not contained within the locations, so they can link to one canonical version of each article.
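
The "one canonical version" part is typically signalled with a rel=canonical tag in the page head; a hypothetical example (the URL is made up, not Angie's actual markup):

```html
<!-- On every location page that embeds this article, point search
     engines at the single master copy (URL is a made-up example) -->
<link rel="canonical" href="https://www.example.com/articles/siding-contractors">
```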

Again, you are judging what you think is useful; algorithms are trying to judge quality and relevance, not usefulness (other than bounce rates).

Don't try to make useful pages. Try to make quality pages that are relevant to the query.