DONZYWAP
Google crawls every site, but it crawls the index (the homepage) fastest. With this trick, your forums and downloads will be indexed on Google faster and better.
1. Submitting Your Website
- First, register a Gmail account in case you don't have one.
- Visit http://google.com/webmasters/tools/verification
- Add your website.
- Verify your site.
- Use the Meta Code method (on Wapka), because this is the easiest way to verify Wapka sites in Google Webmasters.
- Paste the code given to you into your head tag via
>Edit Site
>Global Settings
>Head Tags.
You can also paste your Title and site Description there (meta tags). You can add more descriptions and tags by separating each with a comma (,).
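A typical block for the Head Tags box might look like the sketch below. The verification token, title, description, and keywords are all made-up placeholders, not values from this tutorial; replace the token with the one Google Webmasters actually gives you.

```html
<!-- Placeholder verification tag: swap in the token Google gives you -->
<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
<!-- Title and description shown in search results (example values) -->
<title>My Wapka Site</title>
<meta name="description" content="Free downloads, forums and tutorials" />
<!-- Extra tags, each separated by a comma -->
<meta name="keywords" content="downloads, forums, tutorials, wapka" />
```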
Then your site can now be crawled. Note: it takes at least 24 hours for your site to appear on Google.
2. Submitting Your Sitemap
By submitting your sitemap, you give Google direct shortcuts to your pages, so it doesn't have to crawl your whole site and then filter out unnecessary or poor pages. With the sitemap, you can also set restrictions on pages, forums, etc.
Procedure
- Add your sitemap via
> Edit Site
> Global Settings
> Head Tags
> Sitemap.xml
(add your site and forum IDs separated by commas, e.g. f123456897, f456789412, 21, 62, 53, 44, 52, f62434, f21434, f5656)
- Then go to http://google.com/webmasters
- Submit the sitemap with this URL: http://yoursite.wapka.mobi/sitemap.xml
(e.g. [link=http://donshalchywap.wapka.mobi/sitemap.xml]http://donshalchywap.wapka.mobi/sitemap.xml[/link])
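For reference, the sitemap.xml that gets submitted is a standard XML sitemap. A minimal one looks roughly like this; I don't know the exact URL pattern Wapka generates, so treat the <loc> values as made-up placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yoursite.wapka.mobi/index.xhtml</loc>
  </url>
  <url>
    <loc>http://yoursite.wapka.mobi/site_21.xhtml</loc>
  </url>
</urlset>
```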
- Then you are good to go.

I have gone through most Wapka sites and noticed that most of them cannot be indexed by the Yahoo Search Engine.
WHY?
It's quite simple: the reason is the default robots.txt file of Wapka sites. Before any good crawler/spider crawls a site, it first fetches robots.txt and checks which areas it may and may not crawl, though some spam bots and bad crawlers don't obey robots.txt.
Now, the default Wapka robots.txt goes like this:
Code: [Select]
User-agent: Slurp
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 60
Let me analyze them:
- User-agent names the crawler: Slurp is Yahoo, Googlebot is Google, and so on. User-agent: * means all spiders, while User-agent: Slurp or User-agent: Googlebot specifies the particular spider you are referring to.
- Disallow: / means the crawler should not touch or crawl any page of your site.
- Disallow: (left empty) means the spider is free to access all your pages.
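You can check these rules yourself with Python's standard urllib.robotparser module. This sketch feeds it the default Wapka robots.txt quoted above and asks which bots may crawl; the site URL is a made-up example:

```python
from urllib import robotparser

# The default Wapka robots.txt quoted above
rules = """User-agent: Slurp
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 60
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Yahoo's crawler (Slurp) is blocked from every page...
print(rp.can_fetch("Slurp", "http://yoursite.wapka.mobi/forum"))      # False
# ...while any other crawler, e.g. Googlebot, is allowed everywhere.
print(rp.can_fetch("Googlebot", "http://yoursite.wapka.mobi/forum"))  # True
```

This is exactly why Wapka sites show up on Google but not Yahoo: the rules single out Slurp and lock it out.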
So

User-agent: Slurp
Disallow: /

simply means that the Yahoo bot should not touch your site, and that's why Wapka sites do not appear in Yahoo Search. If you want your site to be crawled and visible on all search engines, remove that Slurp block from your robots.txt file via
> Edit Site
> Global Settings
> Robots file (robots.txt)
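Going by the analysis above, a robots.txt that lets every search engine in would simply drop the Slurp block, something like this sketch (the Crawl-delay value is kept from the default; adjust it as you like):

```
User-agent: *
Disallow:
Crawl-delay: 60
```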
We're done!
Hope you enjoyed the whole tutorial! Don't forget to say a big thank you and drop your comments to encourage us. As for your site, don't worry, because we'll help you put it on Google and other search engines. Yes, I just said that, and I mean every bit of it. Just keep visiting and reading our tutorials, as well as applying them, and you'll soon be on Google search results!