I'm getting a lot better at keyword research; I can usually tell roughly, based on a site's inlinks and content, how hard it will be to beat. E.g. I'll look at how many keyword-anchored links it has and what type of sites those links come from, as well as the overall content on the site.
Now this is fine when it's the domain itself that's listed in the SERP, but when it's an individual page on one of these supersites I have no idea how to gauge it. E.g. one #1 position I'm looking at is one of those price comparison sites (as most SERPs have now). The page itself has under 100 links and is only PR3, but the main domain has a couple of thousand links and is PR5, so obviously link analysis on the main domain alone doesn't tell me much about that page.
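To show what I mean by splitting page-level from domain-level strength, here's a minimal sketch assuming a CSV backlink export from whatever tool you use (the column names, file name, and URLs are just placeholders, not any particular tool's format):

```python
# Sketch: compare links pointing at the exact ranking page vs. the whole domain,
# and count how many page-level links use the target keyword as anchor text.
# Column names ("target_url", "anchor_text") are assumptions about the export.
import csv
from urllib.parse import urlparse

def summarize_links(csv_path, target_page, keyword):
    target_domain = urlparse(target_page).netloc
    page_links = domain_links = page_keyword_anchors = 0

    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            target = row["target_url"]           # assumed column name
            anchor = row.get("anchor_text", "")  # assumed column name
            if urlparse(target).netloc == target_domain:
                domain_links += 1
            if target.rstrip("/") == target_page.rstrip("/"):
                page_links += 1
                if keyword.lower() in anchor.lower():
                    page_keyword_anchors += 1

    return {
        "links_to_page": page_links,
        "keyword_anchors_to_page": page_keyword_anchors,
        "links_to_whole_domain": domain_links,
    }

# Hypothetical example: how much of the comparison site's strength is page-level?
print(summarize_links("backlinks_export.csv",
                      "https://example-comparison-site.com/widgets/",
                      "cheap widgets"))
```

That split is exactly what I can't interpret: the page-level numbers look weak, but the domain-level numbers are strong.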
While trying to work it out, I was thinking about the black-hat Viagra guys. When they do parasite hosting they're mainly using the host domain's authority to rank, yet they usually still have to build a couple of thousand links to the page, at least in the cases I've seen.
All advice appreciated.