In the next two posts I’ll compare the two and outline some of the pros and cons of each, so you can best determine which method is appropriate for you and your niche.
The Views Per Page Method
Firstly, let’s look at the views per page method. This is an excellent method to use for a consistent approximation of your potential site traffic. The main premise used in this method is that any page can rank for some term or set of terms, so simply by having a large volume of pages you can have consistent traffic.
Pros of the Views Per Page Method
There are several main advantages to this approach. The first is that it provides an extremely fast way to evaluate traffic, something you can do on the back of a napkin in just a few seconds.
“If I publish X posts per month and receive Y visits per post, by the end of month 12 I can have ___ visits.”
Straightforward and simple, I like it.
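To make the napkin math concrete, here’s a minimal sketch of that estimate in Python. The numbers and the function name are my own illustration, and it assumes the simplest version of the model: every post draws the same steady number of visits each month.

```python
def estimated_monthly_visits(posts_per_month, visits_per_post, month):
    # By a given month the site has accumulated posts_per_month * month pages.
    # The model assumes each page draws visits_per_post visits every month,
    # so monthly traffic is simply pages * visits per page.
    total_pages = posts_per_month * month
    return total_pages * visits_per_post

# Example: 10 posts per month, each averaging 50 visits per month.
# By the end of month 12 the site has 120 pages:
print(estimated_monthly_visits(10, 50, 12))  # 6000 visits in month 12
```

The appeal is obvious from the code: two inputs, one multiplication. The weaknesses discussed below all come from what this model leaves out.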
The other main advantage is that this method is what I refer to as a “smooth” estimate, meaning it focuses on long-term averages rather than short-term events. No single page matters outside the context of the whole. For a medium or large affiliate site, this is a good strategy.
Cons of the Views Per Page Method
However, there are also a number of reasons not to use the views per page method.
Firstly, this method completely ignores a niche’s popularity. It’s entirely possible to write hundreds or even thousands of pages on super-small niches, but that doesn’t change the fact that only a few thousand people may search for it in a given month. This means that the fundamental logic may be flawed if the niche isn’t big enough.
Secondly, this method also ignores the competition present within a niche. Even given a requisite number of monthly searches, some niches are much harder to crack than others. The ‘make money’ and ‘forex trading’ subjects, for example, are both very competitive niches that a small site is unlikely to rank well in, and in those cases the traffic assumptions break down very quickly.
Finally, this method does not really consider the quality of a post, and seems to promote a larger number of lower-quality pages. Think about it this way: if your argument is that the raw number of pages is the primary factor in determining traffic, then you’re encouraged to produce as many pages as possible, so long as they meet some minimum quality standard.
Contrast this to a model that focuses on fewer pages of extremely high quality content, with other pages simply acting as pointers to the primary material, and you can begin to see the problem with the approach.