I’ve been using SEMRush a lot more lately. I need some backlinks. Bing delisted me, and I think it might have something to do with the warning in Bing Webmaster Tools that I don’t have enough high-quality backlinks.
One method I landed on was looking at others in the niche. Throw their domain into SEMRush. What is their authority? How much traffic do they get? How many backlinks do they have?
Oh. And this is what I really like… who is linking to them?
So I see who is linking to them and I throw those domains into SEMRush. How high is their authority? Is this a high quality link?
If so, maybe I should check out their website and see if I can figure out how my competitor got their link from them. Maybe I can get one.
So I’ve been doing this by hand a bit. I haven’t put too much into it. I get distracted looking at the amazing sites I end up at.
But that isn’t a bad thing either. Essentially, I am immersing myself in the landscape. It’s good recon. But it can be time consuming.
The personal touch is definitely called for here, but maybe there are some parts of this task that can be sped up. I was thinking it would be nice if I could write some Python to do the SEMRush lookups.
With a little research it looks like, yes indeed. SEMRush does have an API.
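For what it's worth, here is a rough sketch of what such a lookup might look like in Python, assuming you have an API key with a plan that includes API access. The endpoint, report type, and export columns are taken from SEMRush's public API docs as I understand them, so verify them against the current documentation before relying on this; `backlinks_overview` and the column names are my best guess at the relevant report.

```python
# Hedged sketch of a SEMRush Analytics API lookup. The endpoint and
# parameter names below are assumptions based on SEMRush's public docs;
# check them against the current documentation before use.
import csv
import io
import urllib.parse
import urllib.request

API_URL = "https://api.semrush.com/analytics/v1/"


def parse_report(text: str) -> list[dict]:
    """SEMRush API reports come back as semicolon-delimited CSV."""
    reader = csv.DictReader(io.StringIO(text), delimiter=";")
    return list(reader)


def backlinks_overview(api_key: str, domain: str) -> list[dict]:
    """Fetch a backlinks overview report for a root domain."""
    params = urllib.parse.urlencode({
        "key": api_key,
        "type": "backlinks_overview",     # assumed report name
        "target": domain,
        "target_type": "root_domain",
        "export_columns": "ascore,total,domains_num",  # assumed columns
    })
    with urllib.request.urlopen(f"{API_URL}?{params}") as resp:
        return parse_report(resp.read().decode("utf-8"))
```

Each API call costs API units on SEMRush's metered plans, so it would make sense to cache results per domain rather than re-querying.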
Let’s see if we can get that working tonight. It’s already late but let’s give it a shot.
The first thing to know is that it is not free. API access does not come with the Pro & Guru plans, but does come with Business & Enterprise plans.
Ok. Let’s check what plan I have.
Ugh. I’m only paying $139.95/month for whatever my plan is. It doesn’t even seem to have a name. It is listed as “Your Plan.” Ok.
Well I can see in the SEMRush > Dashboard > Subscription info that in order to step up to the Business Plan it would cost me $499.95/mo, or $995.45 per year.
I have to say, I like the savings on that annual plan, but it is still too rich for my blood at the moment.
Maybe this is a good time for me to consider SEMRush’s competitor, Ahrefs, which I have not yet tried. Hmmm. According to their website, the price for their API is out of reach as well.
Maybe I’ll have to automate the front-end with pytest and selenium or something like that. But I don’t think that is happening tonight.
There are probably other parts of this endeavor that might be worth automating as well.
Like, a scraper that tries to find certain parts of these websites. Maybe it could look for a contact email, a blog roll, social media and anything else I find myself always looking for.
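The extraction half of that scraper is easy to sketch with just the standard library. In practice I'd probably fetch pages with `requests` and parse with BeautifulSoup, but the part that pulls out emails and social links can be a couple of regexes over the raw HTML. The function name and the list of social networks here are just my own choices for illustration.

```python
# Hedged sketch: extract contact emails and social profile links from a
# page's HTML. Regex-based extraction is crude but good enough for a
# first-pass outreach scan.
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
SOCIAL_RE = re.compile(
    r"https?://(?:www\.)?"
    r"(?:twitter|x|facebook|linkedin|instagram|youtube)\.com/[\w./-]+"
)


def extract_outreach_info(html: str) -> dict:
    """Return deduplicated emails and social links found in raw HTML."""
    return {
        "emails": sorted(set(EMAIL_RE.findall(html))),
        "social": sorted(set(SOCIAL_RE.findall(html))),
    }
```

Pointing this at a site's contact or about page would cover most of what I find myself hunting for by hand.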
It would also be nice if there were a system keeping track of who I contacted and adding their backlinks from SEMRush to the back of the queue.
I actually don’t want to go too nuts on this. I don’t want to spam the whole world with link requests. Also, I do want to take the time to read these sites and understand them a little.
I think this should be a sweet spot of automation and the human touch. Right now it is all human. It will probably stay mostly human for a while.