apparently the market samurai guys have never used a proxy before
They must not. Not sure how their service would work otherwise, though; teh Googles will get all up in your shit.
So is the MS rank tracker the only thing that's getting fucked? Will their KW tool and SEO comp still work? That's all I care about.
[B]Affected services[/B]
SEO Competition:
Google PageRank (Temporarily unavailable)
Rank Tracker:
Google PageRank (Temporarily unavailable)
[B]Updated services[/B]
Keyword Research:
SEOC (Now powered by Bing search API)
SEOLC (Now powered by Bing search API)
SEO Competition:
Initial SERPS check (Available)
Index Count (Available)
Rank Tracker:
Index Count (Available)
Google rank data (Available with service limitations)
They said the SEO comp was gonna start getting pulled from Bing, I think. I read it quickly, not carefully.
They do use proxies. But who knows how many. It's just a tick box to say, yes, use their built-in proxies. They get 10 zillion queries per second, I bet. People probably load up every single keyword for every single inner page, click "go", walk away for an hour, and come back to see rankings. They were talking about it taking "30 minutes" to update rankings. How many damn keywords are these guys talking about, you know?
Sure, but with a proper queue system and rate limits, you shouldn't ever burn out a proxy. The queue might take a while sometimes and run quickly at other times, but at least you don't run into situations where the entire system breaks down, like the one they're dealing with right now.
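To be concrete, here's roughly the kind of thing I mean: a little TypeScript sketch of a queue that hands each job to whichever proxy frees up soonest, and never fires the same proxy more than once per interval. The proxy URLs, the 10-second interval, and the worker stub are all made-up placeholders, not anything from MS's actual code:

[CODE]
// Minimal sketch of a per-proxy rate-limited queue (Node + TypeScript).
// All names and numbers here are illustrative assumptions.

type Job = { keyword: string };

class ProxyQueue {
  private nextFree = new Map<string, number>(); // proxy -> earliest next-use time (ms)

  constructor(
    private proxies: string[],
    private minIntervalMs: number, // minimum gap between requests on one proxy
  ) {
    for (const p of proxies) this.nextFree.set(p, 0);
  }

  // Pick the proxy that frees up soonest; wait until it's allowed to fire.
  private async acquire(): Promise<string> {
    const [proxy, readyAt] = [...this.nextFree.entries()]
      .sort((a, b) => a[1] - b[1])[0];
    const wait = readyAt - Date.now();
    if (wait > 0) await new Promise((r) => setTimeout(r, wait));
    this.nextFree.set(proxy, Date.now() + this.minIntervalMs);
    return proxy;
  }

  async run(jobs: Job[], worker: (job: Job, proxy: string) => Promise<void>) {
    for (const job of jobs) {
      const proxy = await this.acquire();
      await worker(job, proxy);
    }
  }
}

// Usage: two proxies, at most one request per proxy every 10 seconds.
const queue = new ProxyQueue(["http://p1:8080", "http://p2:8080"], 10_000);
queue.run(
  [{ keyword: "seo tools" }, { keyword: "rank tracker" }],
  async (job, proxy) => {
    console.log(`checking "${job.keyword}" via ${proxy}`);
    // real code would issue the search request through `proxy` here
  },
);
[/CODE]

The point is the queue absorbs the burst: if someone dumps 5,000 keywords in, the jobs just take longer to drain, and no single proxy ever goes over its rate.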
You can do a lot with a surprisingly small number of good proxies if you just architect around it properly. I've read most of their code for Market Samurai; it's very impressive JavaScript. However, what they're describing sounds like they just didn't plan correctly. They should have pushed an update that limits the maximum number of keywords you can track rankings for, or forced users to supply their own proxies, etc. There are a dozen ways they could have prevented this or at least planned around it.
As you know, Bing has made some big improvements in recent years and there are many who are saying that Bing’s algorithm produces more relevant results than Google for many keywords.
Maybe I'm just thinking of the old days, but didn't goog have a per day per IP limit? What kind of rate limits do they have these days?
They're way more advanced than a per-day-per-IP limit these days.
Any Ultimate Niche Finder users want to weigh in on this? The software isn't that expensive, but I would just as soon not buy it if it's going to die before I can really get started with it.
I try not to think of it like that. The way I do my queues is with a randomized delay of a handful of seconds per request, and then if I get a 503 or something, I'll put that proxy on ice for 15+ minutes to make sure it doesn't get hellbanned.
Works nicely, and with the kind of money they have, they can buy a shit ton of proxies and be fine.
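For anyone who wants specifics, the policy looks something like this. TypeScript sketch; the delay range, the 15-minute cooldown, and doRequest() are all illustrative assumptions (I also treat 429 the same as 503, since it's the other common throttle response):

[CODE]
// Sketch of the delay/backoff policy described above: a randomized
// few-second pause before each request, and a 15+ minute "on ice"
// period for any proxy that gets a throttle response.

const COOLDOWN_MS = 15 * 60 * 1000; // bench a proxy for at least 15 minutes
const onIce = new Map<string, number>(); // proxy -> time it comes off ice

function randomDelayMs(): number {
  return 2_000 + Math.random() * 5_000; // 2-7 seconds, tune to taste
}

// Callers should only pick proxies for which this returns true.
function usable(proxy: string): boolean {
  return (onIce.get(proxy) ?? 0) <= Date.now();
}

async function fetchViaProxy(
  proxy: string,
  doRequest: (proxy: string) => Promise<number>, // stand-in; returns HTTP status
): Promise<boolean> {
  await new Promise((r) => setTimeout(r, randomDelayMs()));
  const status = await doRequest(proxy);
  if (status === 503 || status === 429) {
    onIce.set(proxy, Date.now() + COOLDOWN_MS); // don't let it get hellbanned
    return false; // caller retries the job on another usable() proxy
  }
  return true;
}
[/CODE]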
Makes perfect sense. I'd probably also go as far as adding in random other requests and switching out old/new cookies occasionally.
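Something like this would be cheap to bolt on, too. Hypothetical sketch only; the decoy URLs and the odds are arbitrary guesses, not anything MS actually does:

[CODE]
// Sketch of the "look less like a bot" ideas above: occasionally fire a
// harmless decoy request, and rotate a proxy's cookie jar now and then.

const cookieJars = new Map<string, string>(); // proxy -> Cookie header value

// Roughly 1 in 10 requests, visit a boring page instead of a search,
// so the traffic pattern isn't 100% ranked-keyword queries.
function maybeDecoy(): string | null {
  if (Math.random() < 0.1) {
    const decoys = [
      "https://www.google.com/",
      "https://www.google.com/preferences",
    ];
    return decoys[Math.floor(Math.random() * decoys.length)];
  }
  return null;
}

// Every so often, throw away a proxy's cookies and start a fresh session.
function maybeRotateCookies(proxy: string): void {
  if (Math.random() < 0.05) cookieJars.delete(proxy);
}
[/CODE]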
This might interest you (or the MS guys), but have you considered hooking into MS and proxying requests over to a competing service? Seems like a pretty easy way to clean up on MS's failure.
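If anyone actually tried it, the shim could be as dumb as a local HTTP proxy that intercepts the Google search requests and answers them from some other ranking API. Very rough TypeScript/Node sketch (needs Node 18+ for fetch); the backend URL is a pure placeholder, and a real version would have to return HTML that MS's parser accepts:

[CODE]
import { createServer } from "node:http";

// Point the app at this as its HTTP proxy; anything that looks like a
// Google search gets answered from a stand-in ranking service instead.
createServer(async (req, res) => {
  // Requests through a forward proxy carry absolute URLs; the base here
  // only matters for malformed relative paths.
  const url = new URL(req.url ?? "/", "http://unused.local");
  if (url.hostname.endsWith("google.com") && url.pathname === "/search") {
    const q = url.searchParams.get("q") ?? "";
    // Placeholder backend; a real shim would need to reshape this into
    // Google-SERP-style HTML so the client's parser still works.
    const upstream = await fetch(
      "https://example-rank-api.invalid/serp?q=" + encodeURIComponent(q),
    );
    res.writeHead(200, { "content-type": "text/html" });
    res.end(await upstream.text());
  } else {
    res.writeHead(502);
    res.end("only /search is shimmed in this sketch");
  }
}).listen(8888, () => console.log("shim proxy on :8888"));
[/CODE]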