
How to Rank Your Site – Notes from a Science Experiment

Thursday, 14 June 2012 / Published in News

Ranking your website.

Music to your ears, no? We’re all painfully aware that the search engine algorithm updates show no signs of stopping. If anything, SEO will only get harder. It’s tougher to rank now than at any point in the history of the Internet, and many small-time webmasters have speculated that there may come an era in which large companies with deep pockets will be the only ones who can compete.

I’m a writer by trade, but I’ve been known to dabble a little in the webmaster world myself. However, I’ve stayed away from building my niche concepts out into full-fledged websites because frankly, I was really put off by the shifty SEO tactics that many Internet marketers employed in order to rank.

The Research

Let me level with you here. I used to write for a few of those aforesaid shifty marketers. People who wanted keyword-stuffed articles dripping with spam-ready links at every line break. People who planned to spin my carefully crafted content into dozens of unreadable articles and then blast that spammy content all over the Web. Of course, when I was new to the online writing scene, I just wrote what I was told. I had no way of knowing how my clients would use my content or why I had to adhere to such weird guidelines. I just wrote.

As I learned more about the ‘net, I realized that I was playing a (minute) part in polluting the Web. It’s funny; as soon as the algorithm changes started making serious headway, those kinds of clients scattered like roaches. Luckily for me, I had stopped writing for those bozos long before that happened.

The Hypothesis

So I started thinking. If great content is the name of the game and SEO is supposedly getting harder, then one could logically deduce that a crackerjack writer with a working knowledge of SEO could easily rank a site.

Agreed? Great.

Based on this theory, I decided to conduct a little science experiment of my own.

The Experiment

I’d like to think that I’m up-to-date on the dos and don’ts of webmastering these days, so I stuck to the Site Reference plan and constructed my site accordingly. Here’s what I did, step by step:
First and possibly most importantly, I picked a topic that I’m passionate about. It’s a niche that I’m already a part of, so it’s one for which I could produce endless content – and it probably helped that I’m always up on my niche’s newest trends and news.
I chose a short, brandable domain name that’s not an exact match keyword. It’s a word that is related to my niche, however, and it’s easy for users to remember and type directly into their browser address bar.
I made sure that my domain name was aged. Now, this is a hotly debated issue, one that many SEOs have strong views about. All the fuss can be traced back to an old Google patent that went a little something like this:

The debate has stemmed from Google’s wording in the patent, and for that reason, it’s been discussed to death for years on many of the top SEO blogs. For instance, here’s a little snippet from a Whiteboard Friday video made by Rand Fishkin over at SEOmoz:

Now, correct me if I’m wrong, but what I ascertained from this passage is that domain age only matters a teensy smidge, but the backlink profile and authority (or lack thereof) that a site may have had in its past life matters a heck of a lot more. In my site’s case, both were good – not stellar, but good enough to serve as a great foundation for a new site. The moral of this bullet point? If you decide to go with an aged domain for your new site, do your homework before you buy. Check the domain’s old backlinks and general community standing in order to get a feel for how your site will be received by the search engines – I’ve sketched one quick way to peek at a domain’s history right after this list.
I selected a slick template that’s as easy to navigate as it is on the eyes.
I spent the extra money on a nice logo and header.
I planned my site’s navigation ahead of time. I used logical categorization, and I put myself in the user’s shoes when I thought the site out. I used broad categories for the biggest subjects in my niche, and then I made lists of post ideas that would fall within each of those categories so I would have a balanced array of subjects that I knew visitors would expect to see when they arrived at my site.
I sat down and spent a lot of time writing well thought out and carefully researched posts. We’re not talking about stuff you could get from spending top dollar at The Content Authority. No – I mean each and every post is a feature-worthy piece. Pictures, screenshots, deep research, the whole ball of wax.
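
Here’s the quick domain-history check I mentioned in the aged-domain step above. This is just a minimal sketch, not the exact process I followed: it asks the Internet Archive’s Wayback Machine availability endpoint whether the domain has any archived snapshots, which gives you a rough feel for what the domain was in its past life. The domain name is a made-up placeholder, and the JSON shape is what that endpoint typically returns – treat it as a starting point, not a full backlink audit.

```python
# Rough sketch: peek at a candidate aged domain's history via the
# Internet Archive's Wayback "available" endpoint (standard library only).
# Assumptions: "example-niche-domain.com" is a placeholder, and the
# response contains an "archived_snapshots" / "closest" object.
import json
import urllib.parse
import urllib.request

def wayback_snapshot(domain, timestamp=None):
    """Return the closest archived snapshot info for a domain, if any."""
    url = "https://archive.org/wayback/available?url=" + urllib.parse.quote(domain)
    if timestamp:
        url += "&timestamp=" + timestamp  # e.g. "20080101" to peek at an earlier era
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    return data.get("archived_snapshots", {}).get("closest")

if __name__ == "__main__":
    snap = wayback_snapshot("example-niche-domain.com", timestamp="20080101")
    if snap and snap.get("available"):
        print("Earlier life found:", snap["url"], snap["timestamp"])
    else:
        print("No archived history turned up - dig deeper before you buy.")
```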

After my base content was added, I decided to do a bit of keyword research. I chose long tail keywords that I felt users would search for, and I plugged them into Google’s free keyword tool. Most marketers direct people to go for the keywords that have lots of local exact match searches – as in 1,000 or more at a minimum. I decided to go against the grain, so I picked a handful of long-tail strings that each had fewer than 100.

Why? My reasoning is that ranking for each of these keywords will be much easier, and lots of posts with bits of traffic trickling in will eventually add up to gobs of traffic if I can exercise patience and consistently add more great content.

For the keyword posts, I kept it simple. After all, the point of this experiment is not to get bogged down in metrics. I didn’t use any fancy software. Just my trusty Google Keyword Tool cross-referenced with a peek at the top ten sites in the SERPs for the keyword phrases I chose.
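
To make that long-tail filter concrete, here’s a minimal sketch of how you could sift a keyword tool export down to low-volume, multi-word phrases. The CSV file name and the column headers ("Keyword", "Local Monthly Searches") are assumptions – match them to whatever your export actually calls them.

```python
# Minimal sketch of the low-volume, long-tail keyword filter described above.
# Assumption: keyword ideas have been exported from Google's free keyword
# tool to a CSV with "Keyword" and "Local Monthly Searches" columns.
import csv

MAX_VOLUME = 100   # go against the grain: under 100 exact-match searches
MIN_WORDS = 3      # long-tail strings, not single head terms

def pick_long_tail(csv_path):
    picks = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            phrase = row["Keyword"].strip()
            volume = int(row["Local Monthly Searches"].replace(",", "") or 0)
            if volume < MAX_VOLUME and len(phrase.split()) >= MIN_WORDS:
                picks.append((phrase, volume))
    # lowest-volume phrases first
    return sorted(picks, key=lambda pair: pair[1])

for phrase, volume in pick_long_tail("keyword_ideas.csv"):
    print(f"{volume:>4}  {phrase}")
```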

Now, can a little old writer rank a site with no link blasts, no forum marketing, and no crazy SEO data? Well, the site is too new to say for sure – but I have a hunch I’m on the right track. The preliminary findings are (for me) pretty exciting.

The Results

As I’ve already mentioned, the site is still very new. It’s only been around for a little over two weeks now, but the domain name is very old. Once I started adding the content, Google indexed my site almost immediately. Then, when I wrote my first keyword article as a test, I used the long-tail keyword very sparingly. I placed it in the URL, in the title, in one of my headings, and once in the text. I didn’t bother with anything else – I just wrote a great article with lots of good information.
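
If you want a quick sanity check that your placement really is that sparing, here’s a rough sketch using nothing but Python’s standard library. It counts how often a phrase shows up in the page title, in headings, and in the rest of the page text; the file name, slug, and phrase are placeholders, and the parsing is deliberately crude – it illustrates the placement rule, it isn’t a polished SEO tool.

```python
# Rough sketch: count where a long-tail phrase appears on a page
# (title, headings, body text) plus a simple URL-slug check.
# Assumptions: the HTML file, slug, and phrase below are placeholders.
from html.parser import HTMLParser

class PlacementChecker(HTMLParser):
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.current = None      # tag we're currently inside, if it matters
        self.title_hits = 0
        self.heading_hits = 0
        self.body_hits = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2", "h3"):
            self.current = tag

    def handle_endtag(self, tag):
        if tag == self.current:
            self.current = None

    def handle_data(self, data):
        hits = data.lower().count(self.keyword)
        if not hits:
            return
        if self.current == "title":
            self.title_hits += hits
        elif self.current in ("h1", "h2", "h3"):
            self.heading_hits += hits
        else:
            self.body_hits += hits

phrase = "my long tail phrase"
slug = "my-long-tail-phrase-post"
with open("my-first-keyword-post.html", encoding="utf-8") as f:
    checker = PlacementChecker(phrase)
    checker.feed(f.read())

print("in URL slug: ", phrase.replace(" ", "-") in slug)
print("in <title>:  ", checker.title_hits)    # aiming for 1
print("in headings: ", checker.heading_hits)  # aiming for 1
print("in body text:", checker.body_hits)     # aiming for roughly 1
```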

I was flabbergasted when I checked the SERPs two days later. I was expecting to find my article buried pages-deep – if it was even ranking at all for my chosen keyword string.

But there it was. Lucky number 14 – right there on page two.

Here’s the kicker: I paused while writing this article to check my ranking for that keyword once more. Hold onto your keyboards, everyone. My article is now on page one – and it’s number three on the list.

The Next Steps

This experiment of mine is nothing you can’t replicate yourself. I’ve laid out everything I did, start to finish. There are still things that I can do to improve my site, however. First, I can perform an SEO audit to clean up my site. This is a bit on the technical end, but it will be well worth the trouble.

I should also submit a sitemap to Google. And before I hear about it in the comments, trust me, it’s on the “to-do immediately” list. I plan to keep adding content and building it out, eventually ranking for keywords with greater and greater search volume as my website’s brand begins to gain trust in Google’s eyes.
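
For the sitemap item on that to-do list, something as simple as the sketch below will do: build a bare-bones sitemap.xml from a list of post URLs, then submit it through Google Webmaster Tools. The URLs and file name here are placeholders, and this covers only the required <loc> entries.

```python
# Minimal sketch: write a bare-bones sitemap.xml for a small site.
# Assumption: the post URLs below are placeholders for your own.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)

posts = [
    "http://www.example-niche-domain.com/",
    "http://www.example-niche-domain.com/category/first-long-tail-post/",
]
with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(build_sitemap(posts))
```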

This experiment went better than I could have ever hoped. My site has miles to go, but I think I’m off to a running start.

I’d like to hear from you now. Tell me about how you’re ranking.

Have you experienced similar results with your newer sites? Are you planning any mad science experiments of your own post-Penguin?

Source: http://site-reference.com/articles/how-to-rank-your-site-notes-from-a-science-experiment/ 
