Basics of Search Engine Optimisation

What is a Search Engine

Search engines such as Google, Yahoo, Bing and MSN each maintain a large database of their own. They collect website data using spiders. These are not real spiders; they are virtual spiders, software programs that crawl the World Wide Web and gather data.

Millions of searches are conducted every day, on an almost unlimited range of topics. The terms people type in are what we designate as "keywords". There are millions of keywords, but only a few really matter for any given site: the ones that are searched most often and are relevant to your site's topic, and those are the ones you have to master. So how do you attract people searching for what your site has to offer? This set of techniques is called search engine marketing (SEM).

This is a step-by-step guide that will take you from the basics of SEM through to the advanced topics.

First, What is a Search Engine?

A search engine is, in essence, a data encyclopedia that holds the information its viewers are looking for. Whatever you search for, you are given directions on where to go; those directions are website URLs, and those URLs contain the data the viewer was searching for. A search engine is therefore a kind of mediator between viewer and website. It presents a simple search box where you enter the words you are looking for, and it returns the relevant search engine result pages (SERPs). To generate SERPs, the search engine compares your search phrase with the information it holds about various websites and pages, selects matches from its huge database and displays them to the user. With so many results, how does it decide the order in which they appear? That is another good question, and we will find the answer below.
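To make that comparison step concrete, here is a minimal sketch in Python of matching a search phrase against stored page text using a simple inverted index. The page contents are made-up examples, and real engines score relevance in far more sophisticated ways, so treat this purely as an illustration.

    # Minimal illustrative inverted index: maps each word to the pages containing it.
    # The page contents below are hypothetical examples, not real data.
    pages = {
        "example.com/seo-basics": "basics of search engine optimisation and keywords",
        "example.com/recipes": "easy dinner recipes for busy weeknights",
        "example.com/sem-guide": "a step by step guide to search engine marketing",
    }

    # Build the index: word -> set of URLs whose text contains that word.
    index = {}
    for url, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)

    def search(phrase):
        """Return URLs that contain every word of the search phrase."""
        words = phrase.lower().split()
        results = None
        for word in words:
            matches = index.get(word, set())
            results = matches if results is None else results & matches
        return sorted(results or [])

    print(search("search engine"))   # both search-related pages match
    print(search("dinner recipes"))  # only the recipes page matches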

How Does a Search Engine Rank a Website for a Particular Keyword?

This is an interesting topic for the layman as well as for the SEO giants, and it is where the real competition between websites begins. It is an advanced topic, so you can follow this link to jump to that page, or simply continue reading.

Search Engine Classes

Targeted audience, number of visitors, quality of search and professionalism are what determine a search engine's class. Each search engine typically targets specific audiences based on interest and location. World-class search engines look very professional, include virtually the entire web in their database, and return highly relevant search results quickly.

Most of us are familiar with the major general search engines: google.com, yahoo.com and msn.com. A general search engine includes all types of websites and as such targets a general audience. There are also second-tier general search engines such as zeal.com, ask.com and whatyouseek.com; the primary difference is that second-tier engines are less well known and generate significantly less traffic.

There are also several non-general or targeted search engines that limit the types of websites they include in their database. Targeted search engines typically limit by location or by industry / content type or both. Most large metro areas will have local search engines that list local businesses and other sites of interest to people in that area. Some are general and some are industry specific, such as specifically listing restaurants or art galleries.

Many other targeted search engines list sites from any location but only if they contain specific types of content. Most webmasters are familiar with webmaster-oriented search engines such as webmasterworld.com, hotscripts.com, flashkit.com and more. There are niche SEs for practically any industry and interest.

Search Engine Models

There are two fundamentally different types of search engine back ends: site directories and spidering search engines. Site directory databases are built by a person manually inputting data about websites. Most directories include a site's URL, title and description in their database. Some directories include more information, such as keywords, owner's name, visitor rankings and so on. Some directories allow you to control your website's information yourself; others rely on editors who write the information to conform to the directory's standards.

It is important to note that most directories include directory listings as an alternative to the search box for finding websites. A directory listing uses hierarchical groupings, from general to specific, to categorize a site.

Spidering search engines take a very different approach. They automate the updating of information in their database by using robots to continually read web pages. A search engine robot/spider/crawler acts much like a web browser, except that instead of a human looking at the web pages, the robot parses each page and adds the page's content to its database.
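As a rough illustration of what a spider does, the Python sketch below fetches one page, extracts its text and outgoing links, and returns both so that the text can be indexed and the links fed back into a crawl queue. It uses only the standard library; the starting URL is a placeholder, and a real crawler would also respect robots.txt, throttle its requests and handle errors far more carefully.

    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class PageParser(HTMLParser):
        """Collects text fragments and outgoing links from one HTML page."""
        def __init__(self):
            super().__init__()
            self.text_parts = []
            self.links = []

        def handle_data(self, data):
            if data.strip():
                self.text_parts.append(data.strip())

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(url):
        """Fetch one page; return (text to index, absolute links to crawl next)."""
        with urlopen(url, timeout=10) as response:
            html = response.read().decode("utf-8", errors="replace")
        parser = PageParser()
        parser.feed(html)
        text = " ".join(parser.text_parts)
        links = [urljoin(url, link) for link in parser.links]
        return text, links

    # Hypothetical starting point; a real spider keeps a queue of discovered links.
    text, links = crawl("https://example.com/")
    print(text[:200])
    print(links[:10])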

Many of the larger search engines, e.g. yahoo.com and google.com, have both a directory and a spidering search engine, and allow visitors to select which they want to search. Note that many search engines do not have their own search technology and instead contract services from elsewhere. For example, Google's spider SE is its own, but its directory is the Open Directory; additionally, aol.com and netscape.com both use Google's spider SE for their results.

There are a few other search engine models of interest. Some search engines, such as dogpile.com and mamma.com, combine results from other engines. There are also search engines that add extra information to searches, such as Amazon's alexa.com, which uses Google's backend but adds data, gathered through its search bar, about traffic to each site.

Regularly Updating Your Listing

One of the most important things to understand about these database models is how to get into the database and keep your listing updated. With a search directory, you need to submit your site, providing the directory with all the information needed for the listing. It is generally recommended that this be done by hand, either by you or by a person familiar with directory submissions. There are many submission tools that advertise that they automate the submission process. These may be fine for smaller directories, but for the major directories, manual submissions are worth the time.

Not all search directories are free; many charge a one-time or annual fee for review. Many of the free search directories have little quality control. For free directories you may have to submit your site several times before being accepted.

There are three different methods for getting into spidering search engines: free site submission, paid inclusion and links from other sites. Virtually all spidering SEs offer a free site submission; for most, you simply enter your URL into a form and submit. Paid inclusion is normally not difficult, apart from the credit card payment. With free site submission there is no guarantee: the SE may send a spider to your site in the next few weeks, in a few months, or never. With paid inclusion you typically get a guarantee that the page you submitted will be included within a short amount of time. The other standard way to get included is to have links to your website from web pages that are already in the SE's database. The SE spiders are always crawling the web and will eventually follow those links to find your site.

Once you are in a search engine's database, you might change your site and need the search engine to update its records. Each directory handles this differently; generally there will be a form for you to submit a change request. Spidering search engines will eventually find the change and add your updates automatically.

Getting High Rankings

Getting into a search engine's database is only the first step. Without other factors you will not rank in the top positions, which is a prerequisite for quality traffic. So how do you get top positions? You can pay for placement with sponsored links, which is covered in the next section. To place well in the free, organic SERPs, you will need to perform search engine optimization.

Search engine optimization is one of the most complicated aspects of web development. Each search engine uses a different algorithm, based on hundreds of factors, which it is constantly changing and carefully guards as a trade secret. Thus no one outside of the search engines' employ knows with 100% certainty the perfect way to optimize a site. However, many individuals, called search engine optimizers, have studied the art and derived sets of techniques with a track record of success.

In general, there are two areas to focus on for top rankings: on-page factors and linking. On-page factors mean placing your target keywords in the content of your site in the right places; the structure of your website and the technologies used on it also play a role. Linking refers to how other websites link to yours and how your site links internally.
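As a rough illustration of the on-page side, the snippet below checks whether a target keyword phrase appears in a page's title, main heading and body text. The HTML is a made-up example and the checks are far cruder than anything a real engine uses; it only shows the idea of keywords appearing "in the right places".

    import re

    # Hypothetical page source used only for illustration.
    html = """
    <html><head><title>Basics of Search Engine Optimisation</title></head>
    <body><h1>Search Engine Optimisation Basics</h1>
    <p>Placing your keywords in titles, headings and body copy helps engines
    understand what the page is about.</p></body></html>
    """

    def keyword_report(html, keyword):
        """Report where (title, h1, body) a keyword phrase appears, case-insensitively."""
        keyword = keyword.lower()
        title = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
        h1 = re.search(r"<h1>(.*?)</h1>", html, re.S | re.I)
        body = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping
        return {
            "in_title": bool(title and keyword in title.group(1).lower()),
            "in_h1": bool(h1 and keyword in h1.group(1).lower()),
            "in_body": keyword in body.lower(),
        }

    print(keyword_report(html, "search engine optimisation"))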

Search Engines' Marketing Offerings

Search engines in the early days of the web were focused solely on serving the visiting searcher. They worked to capture as much of the web as possible in their database and provide fast, relevant searches. Many early website owners learned to reverse engineer the relevancy algorithms and to make their sites "search engine friendly" to get top rankings. They were the first search engine optimizers, manipulating the search engine's natural or organic SERPs as a means of generating free web traffic.

Often these optimized sites compromised the integrity of the SERPs and lowered the quality of results for the searcher. Search engines fought, and continue to fight, to maintain the quality of their results. Eventually, the search engines embraced the fact that they are an important means of marketing websites. Today most search engines offer an array of tools to balance website owners' need to market with maintaining quality for the searcher.

You can generally divide search engine marketing tools into free and for-pay. Realize that these classifications are from the search engine's point of view: effort and expense are required to set up and maintain any search engine marketing campaign.

Organic rankings are still one of the most important ways to drive quality traffic. Search engines now seek to reward ethical, high-quality websites with top rankings and to remove inappropriate "spam" websites. While organic rankings can produce continual free traffic, it takes time from an experienced individual to achieve optimum results. Additionally, organic placement offers no guarantees: it generally takes months to get listed, and results can be unpredictable once you are listed.

Some search engines offer services that add more control to your organic campaign. Most of these services will list / update your site faster or will guarantee that all essential content is listed. For integrity reasons, no major search engine offers higher organic rankings for a fee.

If you need top rankings quickly, pay-per-positioning (PPP) is the most popular way to go. PPP rankings appear in the normal organic SERPs but are usually designated as "sponsored listings". PPP listings use a bidding process to rank sites: if you are the top bidder, i.e. willing to pay the most per click on a given phrase, you get the top placement; the second-highest bidder is second, the next is third, and so on. While most PPP works using this model, some search engines offer variations, such as Google's AdWords, where bid price and click-through rate are both factors in positioning.
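As a simplified worked example of those two models, the snippet below ranks three hypothetical advertisers first by bid alone and then by the product of bid and click-through rate. The numbers are invented, and real auction mechanics (quality scores, second-price billing and so on) are considerably more involved.

    # Hypothetical advertisers: (name, max bid per click in $, click-through rate).
    advertisers = [
        ("Advertiser A", 1.50, 0.02),
        ("Advertiser B", 1.00, 0.05),
        ("Advertiser C", 2.00, 0.01),
    ]

    # Pure bid ordering (the basic PPP model): the highest bid wins.
    by_bid = sorted(advertisers, key=lambda a: a[1], reverse=True)

    # AdWords-style ordering: rank by bid * CTR, so a cheaper but more
    # relevant ad can outrank a higher bid.
    by_rank = sorted(advertisers, key=lambda a: a[1] * a[2], reverse=True)

    print([name for name, _, _ in by_bid])   # ['Advertiser C', 'Advertiser A', 'Advertiser B']
    print([name for name, _, _ in by_rank])  # ['Advertiser B', 'Advertiser A', 'Advertiser C']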

Search engines have many other marketing tools, such as search-specific banner ads, listings on affiliate sites, and more.

Getting Started

The majority of websites have sub-optimal search engine marketing. Most sites have no effective search engine marketing and are continually missing out on valuable leads. Many other websites are too aggressive, wasting money on low value traffic or harming the functionality of their site due to over optimization. Too many sites are even paying money and receiving no results because they have trusted unethical or inexperienced search engine optimizers.

All SEM campaigns should start with a strategic evaluation of SEM opportunities based on return on investment (ROI). You need to assess how much each lead is worth for each keyword phrase and determine which SEM tools will achieve the best ROI for the phrase.
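As a back-of-the-envelope sketch of that evaluation, the Python snippet below estimates ROI per keyword phrase from expected monthly clicks, conversion rate, lead value and campaign cost. All of the figures are invented for illustration only.

    # All figures are hypothetical and exist only to illustrate the calculation.
    keywords = {
        "search engine optimisation": {"clicks": 800, "conv_rate": 0.02, "lead_value": 50.0, "cost": 400.0},
        "cheap web hosting":          {"clicks": 300, "conv_rate": 0.05, "lead_value": 20.0, "cost": 350.0},
    }

    for phrase, k in keywords.items():
        revenue = k["clicks"] * k["conv_rate"] * k["lead_value"]
        roi = (revenue - k["cost"]) / k["cost"]
        print(f"{phrase}: revenue ${revenue:.0f}, ROI {roi:.0%}")

A phrase with a positive ROI is a candidate for more aggressive SEM spending, while a phrase with a negative ROI may need cheaper tools or should be dropped.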

You also have to decide how much you want to do in-house versus retaining an expert. A qualified expert will typically produce better results faster, but the higher expense may destroy the ROI. Often it is best to work with an expert as a team: the expert develops the strategy, and internal staff handle implementation and ongoing management.

 
