June 2019

By: Tom | Updated: July 1, 2019

Whooooooo - summer is now here! 🙂

June was an incredible and memorable month.

The Toronto Raptors took home the NBA Championship for the first time ever!!

It's also been 1 year since I began my journey into passive income entrepreneurship.

Stay tuned as I'm planning to release a 1-year reflection post soon (with video!).

In this post, I'll review this month's results and share my current approach to link building.

Thanks in advance for reading. I really appreciate it!

This Month's Results

Key Metrics

  • Affiliate Revenue: $167 (+69% MoM)
  • 30-Day Unique Page Views: 6,046 (+23% MoM)
  • # of Articles (Live): 82 (+11 MoM)
  • # of Links (Live): 6 (+1 MoM)

Commentary

Revenue came in at $160 for the month (of which $29 was non-Amazon). I was pleased to see this and am working on improving existing articles to boost conversion rates.

Traffic growth, however, struggled as rankings for target keywords either plateaued or declined. I'm hoping that more links to my site plus some on-page optimization will fix this problem.

In addition, I lost the featured snippet for a high-volume keyword that I'd gained last month 🙁

In short: nothing is permanent in SEO!

This month, I completed my last few "Best X for Y" articles from this batch.

For the rest of the month, I was busy building a scalable and efficient link building process.

As a reminder, links are an indicator of "popularity", which influences keyword rankings.

I'm following the "shotgun skyscraper" a.k.a. "spray and pray" approach. This method is recommended by Authority Hacker in their beginner course The Authority Site System.

While I have been skeptical of this approach in the past (due to a lack of success), I wanted to give it another shot by running truly massive email campaigns (with thousands of prospects).

First, I want to credit Jason Malone from this podcast for inspiring most of my process.

Now without further ado, let's dive into my process and the tools that I'm using.

My Step-By-Step Process For Link Building

The basic idea of "shotgun skyscraper" is this:

  • Email a lot of people with a generic email template
  • Follow up with the people who reply, typically by showing them an awesome article on your site and politely asking for a link

According to Jason Malone, the "sweet spot" is sending 300-400 emails per day.

To get there, we need a massive list of relevant prospects (i.e. other sites) for each campaign.

So here's how we do this.

1. Build A List of Keywords

First, we need to build a list of keywords to help us find relevant sites in step #2.

For example, my authority site (The Derm Detective) is about acne scars, so a keyword could be "how to get rid of acne scars". Typing this into Google will return hundreds of relevant sites.

I use a keyword research tool like Ahrefs to find thousands of related keywords starting with a seed keyword. I generally set a minimum volume of 10 to remove really long-tail keywords.

Repeat this process with a few seed keywords, eliminate duplicates, and voila - you now have thousands of relevant keywords!

Before step #2, I like to manually check my keyword list to remove any irrelevant keywords. I also use a keyword "blacklist" to automatically remove negative keywords that I know from experience will result in irrelevant or inappropriate sites.
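To give you a feel for this step, here's a minimal Python sketch of how I'd dedupe and filter keyword exports. It assumes CSV exports with "Keyword" and "Volume" columns (roughly what Ahrefs gives you); the filenames and blacklist terms are just placeholders, not my actual lists:

```python
import csv

MIN_VOLUME = 10
BLACKLIST = {"free", "diy", "tattoo"}  # hypothetical negative keywords

keywords = {}
for path in ["seed1_export.csv", "seed2_export.csv"]:  # hypothetical export files
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            kw = row["Keyword"].strip().lower()
            volume = int((row["Volume"] or "0").replace(",", ""))
            if volume < MIN_VOLUME:
                continue  # drop really long-tail keywords
            if any(bad in kw for bad in BLACKLIST):
                continue  # drop known-bad negative keywords
            keywords[kw] = max(volume, keywords.get(kw, 0))  # dedupe across seeds

print(f"{len(keywords)} keywords survived the filters")
```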

2. Pull URLs With ScrapeBox

ScrapeBox

This is where the magic happens!

Now that you have a list of keywords, we're going to use a software tool called ScrapeBox to (you guessed it) scrape results from Google and Bing.

ScrapeBox has many features but the one we care about is its search engine harvester function.

Basically, ScrapeBox will query the search engines and pull URLs from the results pages. If you start with a list of 1,000+ keywords, you could end up with a list of 100,000 URLs by the end!

Of course, this includes a lot of duplicate URLs so maybe 10,000 or so uniques, but still a LOT!

In terms of getting started, I highly recommend checking out Loopline's videos on YouTube. He has many tutorials including one on setting up Storm Proxies with ScrapeBox.

Now, you can't just go scraping Google results willy-nilly. If you did, your IP address would quickly get banned from using Google (usually just temporarily).

So, we need to use a proxy service that lets ScrapeBox fetch Google results via proxies (i.e. other IP addresses) in order to successfully pull our list of URLs.

The best way to do this is with Storm Proxies, as they provide backconnect proxies (also known as rotating or reverse proxies). Essentially, they give you access to their pool of 70,000 proxies, and each HTTP request from ScrapeBox goes out through a different proxy, which means more successful pulls!
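To make the proxy idea concrete, here's a rough Python sketch of what a harvester does conceptually. This is not ScrapeBox's actual code; the proxy endpoint is a made-up placeholder and the regex is just a crude way to pull links out of a results page:

```python
import re
import requests

# Hypothetical backconnect endpoint - each request through it exits
# from a different IP in the provider's pool
PROXY = "http://user:pass@rotating.proxy.example:3000"

def harvest(keyword: str) -> list[str]:
    resp = requests.get(
        "https://www.bing.com/search",
        params={"q": keyword},
        proxies={"http": PROXY, "https": PROXY},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=15,
    )
    # Crudely pull every outbound link from the results page
    return re.findall(r'<a href="(https?://[^"]+)"', resp.text)
```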

Alright, enough geek speak for one post!

Once you've bought Storm Proxies and set up ScrapeBox properly (see Loopline's videos), just press GO and let it run. It can take anywhere from a few hours to overnight to finish pulling the results.

3. Filter URLs With ScrapeBox

Now that you have a massive list of URLs, we have another problem, Houston.

You'll likely find a bunch of "junk" results or simply irrelevant ones in your list.

If you combed your keyword list earlier and ran ScrapeBox with a conservative URLs-per-keyword setting (under 200), then you shouldn't have too many irrelevant results, but it does happen.

So the next step is to filter out the undesired URLs. Thankfully, ScrapeBox can help us do this.

First, you should remove duplicate URLs (literally the first thing I do every time).

Next, you can remove duplicate domains (when the same site shows up more than once). I prefer to do this only after applying my filters.
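Here's a minimal Python sketch of both dedupe passes, just to illustrate the logic (ScrapeBox has built-in buttons for this):

```python
from urllib.parse import urlparse

def dedupe(urls: list[str]) -> list[str]:
    # Pass 1: remove exact duplicate URLs, preserving first-seen order
    unique_urls = list(dict.fromkeys(urls))
    # Pass 2: keep only one URL per domain (I run this pass after filtering)
    seen_domains, result = set(), []
    for url in unique_urls:
        domain = urlparse(url).netloc.lower()
        if domain not in seen_domains:
            seen_domains.add(domain)
            result.append(url)
    return result
```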

Now, we have a list of unique URLs. The problem, though, is sorting out the junk in an efficient manner since we want to create a scalable link building process.

The solution is to use blacklists.

Essentially, I build a series of blacklists for keywords, domains, and any patterns in the URL structure that may indicate a site is not high quality or relevant for my purposes.

For example, here are the blacklists (and whitelists) I'm using:

  • Domain Extension Whitelist: keep only domains with .com, .net, .ca, etc.
  • Domain Extension Blacklist: remove domains with .org, .gov, .eu, etc.
  • Keyword Blacklist: remove URLs with particular phrases such as "/download" (likely a download page), "/shop" (likely an ecommerce site), "/2011" (likely a really old blog post)
  • Domain Blacklist: some URLs can't be removed based on a keyword blacklist as the URL structure doesn't tell you what the site is about. For these, I just save them in a domain blacklist so they'll automatically be removed from future lists
  • On-Page Keyword Blacklist: I'm still experimenting with this one right now. ScrapeBox has an add-on feature called "Page Scanner" that lets you categorize URLs based on words or phrases found in the HTML file (the text of the page plus additional page elements). This is really powerful as it can "read" the content of webpages and semi-replace the manual check done by a human. It can also be used to find particular pages that share a common footprint; for example, I could pull URLs that include the text "Powered by WordPress" to find WordPress blogs. However, I find that this returns a lot of false positives when used to eliminate URLs, particularly with shorter words. For example, when I tried to remove webpages containing the word "sex", it flagged a bunch of relevant sites. As you can imagine, the problem is this word is used in several different connotations and is also embedded in longer words (as well as random HTML strings). "Casino", on the other hand, was much more accurate at removing only the undesired sites.
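To show how these lists come together, here's a simplified Python sketch of the filtering pass. The whitelist and blacklist entries are illustrative placeholders, not my actual lists (this requires Python 3.9+ for removeprefix):

```python
from urllib.parse import urlparse

TLD_WHITELIST = (".com", ".net", ".ca")
URL_KEYWORD_BLACKLIST = ("/download", "/shop", "/2011")
DOMAIN_BLACKLIST = {"some-junk-site.com"}  # hypothetical entry

def keep(url: str) -> bool:
    parsed = urlparse(url)
    domain = parsed.netloc.lower().removeprefix("www.")
    if not domain.endswith(TLD_WHITELIST):
        return False  # extension not on the whitelist
    if domain in DOMAIN_BLACKLIST:
        return False  # domain already flagged on a previous run
    if any(bad in parsed.path.lower() for bad in URL_KEYWORD_BLACKLIST):
        return False  # URL pattern suggests a page I don't want
    return True

urls = [
    "https://www.skinhelp.com/acne-scar-guide",
    "https://example.org/research",
    "https://store.example.net/shop/serums",
]
clean_urls = [u for u in urls if keep(u)]  # keeps only the first URL
```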

Note that the filtering stage is just that: filtering.

Don't try to remove every single irrelevant URL, as that's simply not efficient. It's better to look for patterns and intelligent ways to remove hundreds of URLs at a time. With each iteration, your filters will get better and better, and you'll need to manually check the list less and less.

Finally, when you're satisfied with your "clean" list of URLs, we can move on to finding emails.

4. Find Emails Using Hunter

This step is fairly straightforward. I'm using Hunter.io to pull emails using my list of URLs.

I haven't tested many services like Hunter but I hear that it has very good data quality. It also lets you pull emails with only a domain name whereas other places require a first and last name.

Upload your list to Hunter and use its bulk "Domain Search" tool. Then, you can choose how many emails you want to download, sorted by Hunter's estimate of email deliverability. I tend to choose the highest or second highest option as long as the overall deliverability is above 90%.

Basically, if you download an email that has poor deliverability (i.e. more likely to bounce), you are wasting 2 credits - one to download the email and then another one to verify the email, only to find that it doesn't exist or has been deactivated.

There's one additional mini-step here. If there's more than one email attached to a site, Hunter will provide all of the emails and it's up to you to decide which one to contact.

This is quite a manual task for a list of hundreds or thousands of emails. Frankly, I haven't yet concocted a formula to intelligently select the best email. But I've made an email blacklist to remove irrelevant emails like "support@", "shipping@", "billing@", etc.
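For reference, here's roughly how this could look against Hunter's v2 Domain Search endpoint in Python. The endpoint is from Hunter's public docs as I understand them; the API key and blacklist are placeholders, so double-check the response fields against their documentation:

```python
import requests

API_KEY = "YOUR_HUNTER_API_KEY"  # placeholder
EMAIL_PREFIX_BLACKLIST = ("support@", "shipping@", "billing@")

def best_email(domain: str) -> str | None:
    resp = requests.get(
        "https://api.hunter.io/v2/domain-search",
        params={"domain": domain, "api_key": API_KEY},
        timeout=15,
    )
    resp.raise_for_status()
    emails = resp.json()["data"]["emails"]
    # Drop role addresses on my blacklist, then take the highest-confidence one
    candidates = [
        e for e in emails
        if not any(e["value"].startswith(p) for p in EMAIL_PREFIX_BLACKLIST)
    ]
    candidates.sort(key=lambda e: e.get("confidence", 0), reverse=True)
    return candidates[0]["value"] if candidates else None
```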

5. Check Email Deliverability Using Hunter

Now that we have our clean list of emails, there is one more step before we can start sending.

It's extremely important to verify your email list to weed out any high risk emails. Essentially, if you send too many emails that bounce (not delivered), email providers may start to "block" or "throttle" your email deliverability as they will believe your emails are spam.

An email verifier service runs a number of "checks" in the back-end, like pinging the SMTP mail server to see if an email address is deliverable. This way, you can reduce your overall bounce rate and avoid getting flagged as a spam sender.

I currently use Hunter for this step as well ($0.01 per email based on the PRO plan) but there are other options like NeverBounce (a bit cheaper at $0.008).
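And here's a matching sketch for the verification step, again against Hunter's documented v2 API. I'm going from memory on the exact response fields ("result" and "score"), so treat those as assumptions to verify:

```python
import requests

API_KEY = "YOUR_HUNTER_API_KEY"  # placeholder

def safe_to_send(email: str) -> bool:
    resp = requests.get(
        "https://api.hunter.io/v2/email-verifier",
        params={"email": email, "api_key": API_KEY},
        timeout=15,
    )
    resp.raise_for_status()
    data = resp.json()["data"]
    # Keep only addresses Hunter marks deliverable with a high score -
    # everything else is a bounce risk
    return data.get("result") == "deliverable" and data.get("score", 0) >= 90
```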

6. Craft an Email Campaign

Now we can actually create our email campaign!

I like this comment from Gael at Authority Hacker, who tells us that the point of your initial email is simply to find people who are "repliers". Even the best emails will be ignored if the other person is too busy, not interested, or simply doesn't reply to their emails.

So my current email copy is real simple:

Hi {{first_name:"there"}},

Is this the right email to contact the editor of your site?

If not, would you kindly direct me to the appropriate person?

Many thanks,
Tom

As you can see, I'm using a merge tag for first name if it's available. Otherwise it'll say "Hi there".
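If you're wondering how that fallback works mechanically, here's a tiny Python sketch (my guess at the behavior, not Hunter's actual implementation):

```python
import re

def render(template: str, first_name: str | None) -> str:
    # Replace {{first_name:"there"}} with the contact's name,
    # or with the quoted fallback when no name is available
    return re.sub(
        r'\{\{first_name:"([^"]*)"\}\}',
        lambda m: first_name or m.group(1),
        template,
    )

print(render('Hi {{first_name:"there"}},', "Ana"))  # Hi Ana,
print(render('Hi {{first_name:"there"}},', None))   # Hi there,
```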

I'm also using Hunter to manage my email outreach. I used to use MailShake for this but since Hunter is essentially free to use (I already pay for their other services), it was a no-brainer.

MailShake offers many more features (better design and A/B testing), but for what I'm after, it's fine to use Hunter for the time being.

7. (Finally) Reply to Emails

At last, we're at the part where we "pitch" our prospects to hopefully earn ourselves a link or two.

Once I receive a reply, I'll spend 10-15 minutes on their site to understand what they're all about. Mainly, I'm looking for the right "angle" - i.e. why are you contacting them?

It's worthwhile to spend this time on each prospect because they're more likely to continue the conversation compared to pitching a cold prospect (i.e. someone who hasn't replied yet).

It's also important to use the right jargon in your email. For example, if it's a skincare clinic, I'll refer to patients or clients. If it's a blogger, I'll talk about readers or visitors. If it's an ecommerce store, I'll mention customers or shoppers, etc.

While our goal is to ultimately obtain a link, I think it's better to demonstrate some value first.

This could be sharing your well-researched article and explaining how it benefits their audience, providing some feedback on their site, or (if you're really advanced) pointing out technical issues on their site like broken links or 404 pages.

Overall, this step is more art than science. It requires a human touch and superb communication skills to convince total strangers on the Internet to give you a link.

8. Refine The Process

The very last step is to refine the overall process. As Mark from Authority Hacker mentions, those who continually improve the efficiency of their processes are the ones who ultimately excel.

This includes:

  • Scraping: experiment with different settings on ScrapeBox
  • Filtering: remove more irrelevant results to save time on manual reviews
  • Sending Emails: keep an eye on deliverability
  • Pitching: test different pitches to see which ones are most effective
  • Replying: test different responses/strategies to convince a total stranger to give you a link
  • Tracking: keep a neat record of results for analysis and to avoid duplicate efforts

Next Steps

Overall, June was another encouraging month that marked the 1-year anniversary of my journey.

Throughout this process, I've had so many thoughts, ideas, and reflections which I'll share with you in a separate post for my 1-year review!

At this time, my plan is to grow my authority site to $1K per month in revenue by the end of 2019.

To Flexibility and Freedom,

Tom
