Google throws a wrench in Market Samurai's business:



So is the MS rank tracker the only thing that's getting fucked? Will their KW tool and SEO comp still work? That's all I care about.

They said the SEO comp was gonna start getting pulled from Bing, I think. I read it quickly, not carefully.

They must not. Although I'm not sure how their service would work otherwise; teh Googles will get up all in your shit.

They do use proxies. But who knows how many. It's just a tick box to say, yes, use their built-in proxies. They get 10 zillion queries per second, I bet. People probably load up every single keyword for every single inner page, click "go", walk away for an hour, and come back to see rankings. They were talking about it taking "30 minutes" to update rank. How many damn keywords are these guys talking about, you know?
 
Last update:

Code:
[B]Affected services[/B]

SEO Competition:
Google PageRank (Temporarily unavailable)

Rank Tracker:
Google PageRank (Temporarily unavailable)


[B]Updated services[/B]

Keyword Research:
SEOC (Now powered by Bing search API)
SEOLC (Now powered by Bing search API)

SEO Competition:
Initial SERPS check (Available)
Index Count (Available)

Rank Tracker:
Index Count (Available)
Google rank data (Available with service limitations)
 
They said the SEO comp was gonna start getting pulled from Bing, I think. I read it quickly, not carefully.



They do use proxies. But who knows how many. It's just a tick box to say, yes, use their built-in proxies. They get 10 zillion queries per second, I bet. People probably load up every single keyword for every single inner page, click "go", walk away for an hour, and come back to see rankings. They were talking about it taking "30 minutes" to update rank. How many damn keywords are these guys talking about, you know?

Sure, but with a proper queue system and rate limits, you shouldn't ever burn out a proxy. The queue might take a while sometimes and move quickly at others, but at least you don't run into the kind of total breakdown they're dealing with right now.

You can do a lot with a surprisingly small number of good proxies if you just architect around it properly. I've read most of their code for Market Samurai, and it's very impressive JavaScript. However, from what they're explaining, it seems like they just didn't plan correctly. They should have pushed an update that caps the number of keywords you can track rankings for, or forced users to get their own proxies, etc. There are a dozen ways they could have prevented this or at least planned around it.
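
Just to illustrate what I mean by a queue with rate limits, here's a rough sketch. This is not Market Samurai's actual code; the proxy addresses, delay, and keywords are all made up:

Code:
import itertools
import queue
import time

PROXIES = ["p1.example:8080", "p2.example:8080"]   # hypothetical proxy pool
MIN_GAP = 20.0                                     # seconds between hits on one proxy (made-up number)

jobs = queue.Queue()
for kw in ["blue widgets", "red widgets"]:         # whatever keywords the user queued up
    jobs.put(kw)

last_hit = {p: 0.0 for p in PROXIES}
rotation = itertools.cycle(PROXIES)

def fetch_rank(keyword, proxy):
    """Placeholder for the real SERP request and parsing."""
    print(f"checking {keyword!r} via {proxy}")

while not jobs.empty():
    proxy = next(rotation)
    wait = MIN_GAP - (time.time() - last_hit[proxy])
    if wait > 0:
        time.sleep(wait)                           # the queue slows down instead of burning the proxy
    fetch_rank(jobs.get(), proxy)
    last_hit[proxy] = time.time()

The point is that when the proxy pool is small, the queue just takes longer; it never hammers one proxy hard enough to get it banned.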
 
Sure, but with a proper queue system and rate limits, you shouldn't ever burn out a proxy. The queue might take a while sometimes and move quickly at others, but at least you don't run into the kind of total breakdown they're dealing with right now.

You can do a lot with a surprisingly small number of good proxies if you just architect around it properly. I've read most of their code for Market Samurai, and it's very impressive JavaScript. However, from what they're explaining, it seems like they just didn't plan correctly. They should have pushed an update that caps the number of keywords you can track rankings for, or forced users to get their own proxies, etc. There are a dozen ways they could have prevented this or at least planned around it.

Agreed. They should at least give users the ability to query Google directly and use their own proxies. The cloud service was nice, but it seems like its days are done. That and keyword/query limits should solve the problem.
 
In general, considering what MS has gone through with this Google disaster, I think their PR response has been pretty legit, with the exception of this statement after explaining that many services will have to switch to Bing since Google no longer works:

As you know, Bing has made some big improvements in recent years and there are many who are saying that Bing’s algorithm produces more relevant results than Google for many keywords.

L O L
 
Why couldn't they just have the software do the queries from the user's end? It's not like the user base for this type of software has never heard of a proxy, or wouldn't have use for one.

The only reasons I could see for them doing it this way are either some custom Google service they subscribe to, or preventing people from reverse engineering the software.
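
Client-side, something like this is all it would take. Totally hypothetical sketch: the proxy address is a placeholder, the HTML parsing is left out, and obviously Google can still throttle whatever IP the request comes from:

Code:
import requests

# The user's own proxy -- address is a placeholder
USER_PROXY = {"http": "http://127.0.0.1:8888", "https": "http://127.0.0.1:8888"}

def check_serp(keyword):
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": keyword, "num": 100},
        proxies=USER_PROXY,              # the user supplies and pays for this, not the vendor
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=15,
    )
    resp.raise_for_status()              # a 429/503 here means that IP is being throttled
    return resp.text                     # pulling the rankings out of the HTML is left to the reader

html = check_serp("market samurai alternative")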
 
Sure, but with a proper queue system and rate limits, you shouldn't ever burn out a proxy. The queue might take a while sometimes and move quickly at others, but at least you don't run into the kind of total breakdown they're dealing with right now.

Maybe I'm just thinking of the old days, but didn't goog have a per-day, per-IP limit? What kind of rate limits do they have these days?
 
Seems kind of retarded to build a multi-million-dollar business depending on Google's goodwill without a backup plan. (lol bing)
 
Any Ultimate Niche Finder users want to weigh in on this? The software isn't that expensive, but I would just as soon not buy it if it's going to die before I can really get started with it.
 
Any Ultimate Niche Finder users want to weigh in on this? The software isn't that expensive, but I would just as soon not buy it if it's going to die before I can really get started with it.

I bought it last week and it's working just fine for me; I'm using it right now as I browse forums for gay webmasters. Anytime Google tweaks something, they usually have a patch out within 12 or so hours.
 
Maybe I'm just thinking of the old days, but didn't goog have a per-day, per-IP limit? What kind of rate limits do they have these days?

I try not to think of it like that. The way I do my queues is with a randomized delay of a handful of seconds per request, and if I get a 503 or something, I'll put that proxy on ice for 15+ minutes to make sure it doesn't get hellbanned.

Works nicely, and with the kind of money they have, they can buy a shit ton of proxies and be fine.
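
Roughly, the logic looks like this. The numbers and names are just my placeholders; the real delays depend on how aggressive you want to be:

Code:
import random
import time

COOLDOWN = 15 * 60                       # "on ice" period after a 503 (my number)
benched_until = {}                       # proxy -> timestamp when it becomes usable again

def proxy_is_usable(proxy):
    return benched_until.get(proxy, 0) <= time.time()

def run_query(proxy, do_request):
    """do_request(proxy) performs the actual query and returns the HTTP status code."""
    time.sleep(random.uniform(3, 8))     # randomized handful-of-seconds delay per request
    status = do_request(proxy)
    if status == 503:                    # Google is rate-limiting this IP
        benched_until[proxy] = time.time() + COOLDOWN
    return status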
 
I try not to think of it like that. The way I do my queues is with a randomized delay of a handful of seconds per request, and if I get a 503 or something, I'll put that proxy on ice for 15+ minutes to make sure it doesn't get hellbanned.

Works nicely, and with the kind of money they have, they can buy a shit ton of proxies and be fine.

Makes perfect sense. I'd probably also go as far as adding in random other requests and switching out old/new cookies occasionally.

Maybe this would interest you (or the MSM guys), but have you considered hooking into MS and proxying requests over to a competing service? Seems like a pretty easy way to clean up on MS's failure.
 
Makes perfect sense. I'd probably also go as far as adding in random other requests and switching out old/new cookies occasionally.

Maybe this would interest you (or the MSM guys), but have you considered hooking into MS and proxying requests over to a competing service? Seems like a pretty easy way to clean up on MS's failure.

Each request made by my bots is from a fresh Mechanize instance with no previous cookies, a different user agent, and so on. It's a bunch of brand-new requests at all times; there's no paper trail.

And I'm sure this goes the same for MSM as it does for me, but rather than improve MS's software, we just built the tools we always wanted and opened them up to customers, because we were tired of being frustrated with inferior products. MSM launched at a perfect time, in my opinion, given the state of affairs in the rank-tracking world...
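
If anyone wants the gist of the "fresh instance every time" approach, here's a rough equivalent in plain Python with requests. This isn't my actual bot code, and the user-agent strings are placeholders:

Code:
import random
import requests

USER_AGENTS = [                           # placeholder UA strings -- rotate your own list
    "Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/534.55 Safari/534.55",
]

def fresh_request(url, proxy=None):
    session = requests.Session()                          # brand-new session: empty cookie jar
    session.headers["User-Agent"] = random.choice(USER_AGENTS)
    proxies = {"http": proxy, "https": proxy} if proxy else None
    resp = session.get(url, proxies=proxies, timeout=15)
    session.close()                                       # nothing carries over to the next request
    return resp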