This just in: Moz has stopped reporting the Domain Authority (DA) and Page Authority (PA) of popular Web 2.0 sites commonly used for SEO purposes. This comes as something of a surprise, considering many people use Tumblr, Weebly, and others for legitimate purposes.
Last week I ran into an issue with the Moz API not returning results in the Scrapebox Bulk PA Checker tool. After some research I discovered that Moz had banned some IP addresses associated with this tool. The fix was simple: I used a dedicated proxy to hide the IP of my VPS. Problem solved.
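For anyone hitting the same wall, the general pattern is straightforward: route the bulk checker's HTTP traffic through a dedicated proxy so the banned VPS IP never reaches the API. Here is a minimal Python sketch using only the standard library; the proxy address and the metrics endpoint are placeholders I made up for illustration, not Moz's actual API details.

```python
import urllib.request

def build_proxy_url(host: str, port: int, user: str = "", password: str = "") -> str:
    """Build an http proxy URL, with optional credentials for a dedicated proxy."""
    auth = f"{user}:{password}@" if user else ""
    return f"http://{auth}{host}:{port}"

def make_proxied_opener(host: str, port: int,
                        user: str = "", password: str = "") -> urllib.request.OpenerDirector:
    """Return an opener that tunnels both http and https traffic through one proxy."""
    proxy_url = build_proxy_url(host, port, user, password)
    handler = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

# Usage (placeholder endpoint -- substitute your metrics provider's real API call):
# opener = make_proxied_opener("203.0.113.10", 8080, "user", "pass")
# body = opener.open("https://api.example.com/url-metrics?url=example.tumblr.com").read()
```

Any HTTP client works here; the only requirement is that every request to the API goes out through the proxy's IP rather than your own.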
Moz is clearly against people using their API excessively (and rightfully so), but I had no idea they were so against people using Web 2.0s for SEO. I feel that Moz, as a data provider, should provide data to its users regardless of what the data is going to be used for. This is akin to McDonald's refusing to serve a fat person their beloved Big Mac for health reasons! We want our Big Macs!!… err, I mean our metrics for Web 2.0 sites.
Tumblr: PA1 DA1
Weebly: PA1 DA1
Over-Blog: PA1 DA1
A Tumblr blog that I own which has excellent metrics:
The same Tumblr blog as above now entered into Majestic:
This is really a humdinger. SEO Khan depends heavily on the Moz API to find our customers the highest-quality expired Web 2.0 properties. We also use the Majestic Bulk Backlinks tool to find high-TF Web 2.0 blogs, but that is more of a manual process and takes considerably more time.
Citation Flow can be used as a backup to Page Authority; however, it seems to be a much harder metric to influence. Search Engine Land defines Citation Flow as "a number predicting how influential a URL might be based on how many sites link to it," which is a similar description to Page Authority.
I can tell you anecdotally that I see hundreds of expired Tumblr blogs that read something like CF11, TF0, PA0, RD29. I would normally pass on this domain because everyone wants high-PA Tumblrs, not high-CF Tumblrs, even though CF seems to be a better metric for measuring the strength of backlinks pointing to a page.
I personally feel that Citation Flow is a better metric because I see a lot of PA28/CF0 domains that have 1-2 referring domains (RD) and also a lot of PA0/CF12 domains that have 20+ RD. It seems that Citation Flow is better at measuring the strength of links pointing to a particular domain.
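To make that concrete, here is a hypothetical triage sketch in Python: it keeps only domains whose Citation Flow is corroborated by a healthy referring-domain count, instead of sorting by PA alone. The thresholds and example domain names are my own illustrative guesses, not industry standards or real listings.

```python
from dataclasses import dataclass

@dataclass
class DomainMetrics:
    url: str
    pa: int  # Moz Page Authority
    cf: int  # Majestic Citation Flow
    rd: int  # referring domains

def triage(domains, min_cf=10, min_rd=10):
    """Keep domains where link strength (CF) is backed by real referring domains (RD)."""
    return [d for d in domains if d.cf >= min_cf and d.rd >= min_rd]

candidates = [
    DomainMetrics("shiny.tumblr.com", pa=28, cf=0, rd=2),   # high PA, hollow link profile
    DomainMetrics("quiet.tumblr.com", pa=0, cf=12, rd=29),  # low PA, real links
]
keepers = triage(candidates)
# Under these thresholds, only quiet.tumblr.com survives the filter.
```

The point of the sketch is the ordering of checks, not the exact cutoffs: a high headline metric with almost no referring domains gets filtered out, while an unglamorous domain with a real link profile gets kept.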
I have purchased high-PA Tumblr blogs from sellers on Fiverr and Source Market to test them, and I was shocked at how ineffective they were. People often get hung up on metrics and assume that just because a domain has PA30, it's going to have a ton of good links pointing to it. It's kind of like the person who is just getting started building PBNs and registers a TF25 domain that has only one backlink. They see the TF25, get all starry-eyed, and forget about everything else. This happens all too often in the Web 2.0 world.
If people start demanding expired web 2.0 domains that have high Citation Flow it will completely disrupt the Fiverr and Source Market sellers because they will be forced to purchase expensive Majestic accounts to get around the Fair Use limitations.
Regular Joes who are building expired Web 2.0 PBNs by hand will have a harder time because they will only be able to check a few domains every 24 hours.
Of course, these people could just purchase an SEO Group Buy since they are light users.
I might be jumping the gun a little bit here. Maybe Moz is simply updating their database for Weebly, Tumblr, Over-Blog, etc., but I think that's highly unlikely. Why would all other websites register a PA and DA while only Web 2.0 sites (commonly used for SEO) fail to show metrics?
I messaged Moz via their support system and also Tweeted at them for an answer. I guess we will see what they say.
SEO Khan Customers: Do not worry! We have stockpiles of Web 2.0s on deck. We built our bunker and we can weather the storm.
Well, we have some answers. It looks like Moz was attempting to remove all Web 2.0 sites that have over 10,000 subdomains. Apparently nobody thought to exclude Tumblr, Weebly, Quora, etc. from this purge. Seems like a pretty dumb move on somebody's part.
Here’s where Moz answered us on Twitter:
That being said, Rand doesn't provide any sort of time frame for bringing the data back. Is this just a way to buy Moz some time before they announce that they will NOT be bringing back reporting for Web 2.0s that have over 10,000 subdomains?
Rand Fishkin: "We'll see what we can do." "Hopefully have a fix soon or revert to our previous index." This is quite possibly Moz putting us off and letting the initial shock wear off, only to announce later that they will NOT be bringing back our favorite Web 2.0s. Only time will tell.
Nothing new to report. Moz has not fixed the issue nor have I heard any new info.
I personally think that Moz has no intention of bringing these sites back to their index. They know that many people use their API to aggressively check the metrics of expired Web 2.0 domains. They also know that expired Web 2.0 domains are commonly used for SEO purposes. I would venture to say they also know how many of their paying users are managing sites hosted on these large domains. By purging all domains that have over 10,000 subdomains they are reducing the load on their API caused by people who are using bulk PA/DA checking tools. *removes tin-foil hat*
Moz made some changes to their index, but this issue is still not corrected. Moz is now showing the PA and DA for the root domain and not the subdomain. This is really a kick in the stomach, not only to SEOs who use Web 2.0 sites as part of their strategy, but also to legitimate small business owners and bloggers who have free websites hosted on Tumblr, Weebly, etc.
C’mon Moz! Please fix this stuff. I’m burning through analysis units on Majestic, as I now have to use Citation Flow to measure the strength of incoming links to Web 2.0 subdomains.
I will continue to update this post as new info becomes available.
I just remembered that some of our Moz Buddies in the comments section indicated that this would be resolved on June 21. As much as I wanted to believe them, at the time I felt they were just doing damage control.
Well, as it turns out, they were being truthful. Here are the new results:
I am very grateful to the Moz team for fixing this issue and being honest in their comments. I was very skeptical, and I truly believed that Moz had decided to remove Web 2.0s from their index to cut down on spam. I’ll admit when I’m wrong, and it looks like my tin-foil hat was a little too tight.
Thanks Moz! The Web 2.0 world has returned to normal and all is well in the kingdom once again.