After reading a few topics in this forum and doing a little research, I would like to know if there is more to guiding search bots and being search-engine friendly than this, please…
– use a meta tag to have the engines not index the index.php page but follow its links, and to index the individual post pages (using an “if” for that);
– use WP’s custom permalinks generator to make looks-like-a-directory permalinks (does that really help the bots?);
– use the meta tags for description and keywords;
– use the h1 tag to “attract” the bots’ attention.
Is that all there is to it?
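(For reference, the first point above — index the posts but not the front page — comes down to a one-line conditional in the header template. A minimal sketch of the logic, written in Python rather than a WP template, with a function name of my own invention:)

```python
def robots_meta(is_single_post: bool) -> str:
    """Build the robots meta tag for a page: index individual post
    pages, but have engines only follow links from the front page."""
    content = "index,follow" if is_single_post else "noindex,follow"
    return f'<meta name="robots" content="{content}">'

print(robots_meta(True))   # a post page: indexed and followed
print(robots_meta(False))  # index.php: crawled for links, not indexed
```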
Google has about 75% of the market, so it’s best to code for Google.
The fastest way for Google to find your site is to have a link (or links) from other site(s) that are already in Google. You can get indexed in less than 2 weeks this way. Site submission sites and dmoz.org are slower (months). Google does use the http://www.dmoz.org directory and it is an important factor in Google rankings.
High rankings on Google are a combination of focussed page content and content-relevant links. The latest Google algorithm favours larger ‘authority sites’ on a theme with many inbound links.
Don’t get links just for links’ sake; links should be relevant to the content of the site.
Google doesn’t pay much attention at all to meta tags, as they can be ‘faked’.
Google tries to determine the theme of the page, so for higher rankings each page should focus on 2-3 keywords or phrases that are repeated in the title, main tags (H1, H2) and in the content.
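As a rough self-check on that, you can parse a page and see whether your target phrase actually appears in the title, the H1/H2 tags, and the body text. A sketch using only the standard library (the sample page and phrase are invented for illustration):

```python
from html.parser import HTMLParser

class KeywordCheck(HTMLParser):
    """Collect text from <title>, <h1>/<h2>, and everything else."""
    def __init__(self):
        super().__init__()
        self.current = None
        self.buckets = {"title": "", "headings": "", "body": ""}

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.current = "title"
        elif tag in ("h1", "h2"):
            self.current = "headings"

    def handle_endtag(self, tag):
        if tag in ("title", "h1", "h2"):
            self.current = None

    def handle_data(self, data):
        self.buckets[self.current or "body"] += data

def keyword_placement(html: str, phrase: str) -> dict:
    """Report whether the phrase appears in title, headings, and body."""
    parser = KeywordCheck()
    parser.feed(html)
    p = phrase.lower()
    return {k: p in v.lower() for k, v in parser.buckets.items()}

page = """<html><head><title>New York Restaurants</title></head>
<body><h1>New York restaurants reviewed</h1>
<p>Our picks of new york restaurants this month...</p></body></html>"""
print(keyword_placement(page, "new york restaurants"))
# → {'title': True, 'headings': True, 'body': True}
```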
Google loves blogs – focussed and fresh content and lots of links.
Er, that’s all I can think of for now.
To find out what pages Google has indexed on a particular site go to Google and type: site:www.mygreatdomainname.org
It helps you see what types of blog links Google can follow.
This is a really interesting question. It has been posted before and not one single person responded at all. That might tell us something about the level of interest in this area. I have done some high-powered paid consultancy in this area for banks, online bookshops and the like, but it was a long time ago and I am not completely current. We have all got three strategic design choices at the outset: build for one engine only, build generically with a cross-engine site, or build a range of front pages for different engines. How we make that choice then determines what we do next.
My own thinking currently, as I am just getting back online again, is to go with semantic, clean construction, but to maximise the search engine signature at the outset. I am thinking of about five keywords in the title, a very positive H1, duplicating the H1 in a twenty-five-word first paragraph, and having a keyword population of four per cent of content. Then I am going to watch, measure, and adjust. Google has, in my professional opinion, blown up, and it may be unwise to focus on it exclusively. Good luck guys. I am into this.
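(That “keyword population of four per cent of content” is easy to measure, if you want to check a draft before posting. A quick sketch — my own helper, not any standard tool, and the sample text is made up:)

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

sample = ("WordPress permalinks help search engines. "
          "Clean permalinks and focused content matter; "
          "check your permalinks before publishing.")
print(keyword_density(sample, "permalinks"))  # 3 of 16 words → 18.75
```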
I’m the Anon who said “it’s best to code for Google.”
Google still has the market share and if you build for it you’re likely going to be ok if MS or someone else takes a run at Google.
Yes, Google has ‘blown up’ and had some erratic results lately as they try to improve their algorithms and respond to blog glut and Google bombing. I think it’s only temporary and they will remain the dominant SE.
More notes – pro and con:
Currently no search engine can read CSS, so if you have something bolded or bigger via CSS, Google doesn’t see it. Cleverly, WordPress puts headings into H1 (etc.) tags that Google can read.
Google treats info higher on the page as more important.
Google likes to see the same page that visitors see. Google doesn’t like CSS, especially floating DIVs, as visually users may see ‘the content’ high on the page but via CSS magic it’s actually at the bottom. There’s not much search engines can do about that at the moment, but give them a year or two.
Blogs do well on search engines, but SEs are trying to find algorithms that de-rank personal blogs so that searches on “New York restaurants” aren’t dominated by “what I ate for lunch today” blog musings.
Google likes nice clean simple designs with high content ratios, but blogs tend to have a higher crud/content (post) ratio than standard HTML sites. By crud I mean all the HTML markup, calendars, friends’ links, RSS, powered-by and “use the Firefox browser” stuff that is typical of blogs.
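(You can estimate that crud/content ratio crudely: strip the tags and compare visible text length to total page length. A rough sketch — the regex tag-stripping and the sample pages are simplifications for illustration, not a real HTML parser:)

```python
import re

def content_ratio(html: str) -> float:
    """Visible-text length divided by total HTML length (0.0 to 1.0)."""
    text = re.sub(r"<[^>]+>", "", html)       # drop markup tags
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return len(text) / len(html) if html else 0.0

lean = "<p>Just the post text here.</p>"
cruddy = '<div class="sidebar"><ul><li><a href="#">x</a></li></ul></div>'
print(content_ratio(lean) > content_ratio(cruddy))  # leaner page wins
```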
I’m currently testing a new WP site to see how Google handles it. So far it’s only indexed the front page, but even new non-blog sites have that complaint, so I’ll wait and see.
You might well believe Google will remain the dominant SE. You may be right.
But your advice is very unsound. The Google rankings are all over the place.
Furthermore, there is a very serious question as to whether it is even possible to write code to handle more than six billion pages at all. I said, and I stand by it, “it may be unwise to focus on it exclusively”. The day MS takes a run at anything to do with the internet remotely successfully, far less running a SE, I will eat my hat. But there are plenty of other important engines out there.
The observations as to floating divs are palpable nonsense.
OT: Would it really be that much trouble to register? I’m having the hardest time remembering who said what to whom…
Palpable nonsense? Is that a nice word for B.S.? Them’s fighting words pardner. :o)
Google wants to see the same page that your visitor sees (which is another reason not to make different pages for different engines). If your visitor sees content low on the page, Google would also like to see it low. There are spammy uses for layering and obviously many more legitimate uses, and Google doesn’t penalize for it, but one day the engines will be able to read CSS and the SERPs will reflect it. At the moment though it’s not something to worry about, and it never will be unless you’re dependent on rankings for revenues.
You can’t go wrong optimizing for Google, and if you want to load up meta tags for other engines it can only help.
But enough of this B, er, nonsense… here are Google’s guidelines in their own words…
Wow, thanks for the input. I will just keep listening, this has got up to professional level.
Google likes incoming links from good sites with good PR.
If you can manage that, no meta tag is worth what it is supposed to be…
Well, there is nothing wrong with being at a “professional” level. Many home site builders are better all round than many guys selling web services :-) Just to summarise: IMHO content is king. That means relevant text right up front containing our keywords. Semantic markup plus accessibility make it extremely desirable (paramount) to get our content on the page before the menus. However this can be very, very difficult to do if you also want all the bells and whistles in your layout, even in two columns.
In fact it may be impossible. The critical thing to remember is that Google, thank heavens, cannot read your CSS. This can be abused. It is not my thing, but there are obvious advantages to that state of affairs. Personally I struggle enough keeping up with what the engines are doing now without worrying about the future too much :-) As to Google and its well-known links policy: those guys are clever. And then some. Who has a lot of links? Bloggers. It has however now rendered Google almost useless for serious searches for static sites. Google may not mind that, but other engines are gaining ground and the suits are moving already. Final thought for a beginner.
Work out what words you would like your page to rank highly for. Let’s say:
- The topic ‘Search engines round-up’ is closed to new replies.