So you built a new website or started a new blog on WordPress or Blogger …
Nobody is visiting your site or blog. You’ve spent hours making it great, and it seems like nobody cares! A few weeks go by and still nothing… no visitors!
I know, you feel lost! You have Googled “web traffic” and found a gazillion sites selling you traffic, SEO services, or “FREE Website Traffic” in exchange for xyz. Everybody wants something, right?
Well, hopefully this will help get your site / blog going:
There are a few things you absolutely must do:
Before we begin, if you’re not using Firefox as your browser, get it now and forget all the others: www.mozilla.com/firefox/
Ok let’s start 🙂
1) Get a Google account. If you don’t have one, get one here: www.google.com/webmasters/tools/
2) Add your site / blog to your webmaster account and follow the steps provided by Google.
3) Create a sitemap of your site. If you don’t have a sitemap tool then here is a great free one written in Java: http://www.auditmypc.com/xml-sitemap.asp
4) Add your sitemap to your Google webmaster account for your site.
5) To get indexed quickly by Google, get yourself a SpicyPage account here: http://spicypage.com and add your site. Please vote for LinkChili.com too 😉
6) Get yourself some backlinks. You can get your first one from LinkChili.com. This is a good starting point, especially if you do not already have any links. Be aware that link farms generally cannot provide quality links. By “quality links” I mean links that are contextually relevant to the content of your site and have a minimum “Page Rank” (PR) of 4.
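If you’d rather see what the sitemap from step 3 actually looks like, here is a minimal Python sketch (standard library only; the page URLs are placeholders, not real ones) that emits a sitemap in the sitemaps.org XML format:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Placeholder URLs for a small blog:
pages = [
    "http://www.example.com/",
    "http://www.example.com/about",
    "http://www.example.com/blog/first-post",
]
print(build_sitemap(pages))
```

Save the output as sitemap.xml in your site’s root before submitting it in step 4.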
Ok, now some stuff about your site.
Have you set your title tags?
Selecting a Title Tag
This lets the search engines know what your site/blog is all about. The title tag is the first thing a search engine will see, and your site will be judged on how relevant the tag is to the content of your website. It is highly recommended to ensure that your most important keywords appear in the title tag, to increase the relevancy between the site and the title.
A title tag example for this Article would be: <title>Web Traffic for the New Website or Blog Owner</title>
What do your Keywords look like?
Monitoring your Keyword Density
The keywords you select should all be relevant to the content of your site. Your choice of keywords is critical, so be sure there is a DIRECT relationship between these words and your site’s purpose.
A keyword example for this Article would be: <meta name="keywords" content="Web traffic, New Blog, New Site, Traffic, SEO, Organic Search, Site Owner, Blog Owner, Google">
Close attention should also be paid to the way the words are used within the site’s content, as well as the frequency of their use. Be sure that your selected keywords are used naturally, as they would be in a casual conversation, and take care NOT to use them too frequently.
Overuse could get your site penalized by the search engines, thereby lowering your search engine rankings. If the need to use some words frequently (due to their relevance) arises, have a list of synonyms available for substitution.
A keyword density of between 2 and 5 percent is a suggested range to target, to both accurately reflect your site’s content and keep you in ‘good graces’ with the search engines.
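If you want to check keyword density yourself, here is a rough Python sketch. The tokenization is deliberately naive (real tools also strip markup and stop words), and the sample text is made up for illustration:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of all words in the text, as a percentage.
    A multi-word keyword is counted as a phrase; each occurrence accounts
    for all of its words in the total."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    hits = sum(
        words[i:i + len(phrase)] == phrase
        for i in range(len(words) - len(phrase) + 1)
    )
    return 100.0 * hits * len(phrase) / len(words)

sample = "Web traffic tips: grow your web traffic with steady, relevant content."
print(keyword_density(sample, "web traffic"))
```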
Wow, that seems like a lot of work! Well, there are some free tools that LinkChili.com provides to help you with this task. You will need to register, for free, on LinkChili.com, then log in to your profile. Select “Member SEO Tools”, then “Keyword Density Analyser”. Enter your site’s URL (or the web page you want to check) and away you go!
You are making progress! Now you are ready to get some traffic! Remember, Google loves content! The more you give it, the more often it will come and visit your site, resulting in more frequent crawling and more pages indexed. This increases your chances of appearing in organic search results.
Organic search results are free, directly targeted traffic from a search engine such as Google. Targeted traffic is people who have specifically searched for something your site offers. Organic search results do not involve PPC (Pay Per Click) or non-targeted pop-under or pop-up paid traffic, which generate unwanted or spam hits.
Next you will need to let “AboutUs” know about your site or blog, so register here: http://www.aboutus.org/. This will help you get “spidered” more quickly.
Have you ever heard of Alexa? Alexa measures your “traffic reach” to internet users. You need to tell Alexa about your site or blog too: http://www.alexa.com/siteowners. Don’t be despondent when, in a week’s time, you’re still unranked or have a ranking of 30,000,000; this is normal. It will decrease over time. Your goal is to get your Alexa rating as low as possible and your page rank as high as possible.
Remember you changed your browser to Firefox? Well, now you can install the Alexa Toolbar, called Alexa Sparky; you can get it here: https://addons.mozilla.org/en-US/firefox/addon/5362
Alexa Sparky accompanies you as you surf, providing useful information about the sites you visit without interrupting your web browsing. It also records your own visits to your site or blog as you work on it, which in turn helps lower your Alexa Rank. 🙂
Right, so there is method to all this madness! Now, to get your Alexa rating down, you need visitors as your site ages, so you need to start getting some traffic fast. I don’t really like these sites, but they work well initially. This site has automatic surfing where you earn credits, and in return they display your website to other members, who will open your site if it suits them. The traffic is not huge; I got around 40 visitors a day, but it’s good enough to start off with. Register for a free account here: www.thousandsofhits.com. Now you should be in a position where your site is getting some traffic and your Alexa Rank is coming down, as you can see in your Alexa Sparky toolbar.
All the information provided in this article is explained in full detail on LinkChili.com, with tons of great features for members only; membership is free.
Good luck! All the best with your site / blog; may it grow and grow and grow.
SEO services address the inherent mechanism of the world’s most popular online search tool, the search engine. Access to the secrets of a search engine helps companies improve the search rankings of their websites, boosting traffic volume and attracting relevant visitors. Quality traffic can significantly improve a company’s conversion rates, fueling sales and profitability. With Internet surfers using search engines as their primary online information source, SEO services can help companies tap into a large portion of their potential market.
SEO Services: What to Include?
Whenever one thinks of SEO services for a website, the first service that comes to mind is researching keywords that are appropriate for the business. These keywords are the words and phrases web surfers use to search for a product or service, and they act as the first step to attracting traffic. However, SEO services encompass much more than simply researching the appropriate keywords. Whenever an organization hires an SEO company, it can make use of any of the following services to improve rankings on search engines and enhance visibility, popularity, sales and growth:
Site review and modification: These services involve reviewing and refining content, website architecture and backend site coding in order to make every page more compatible with search engines.
RSS Feeds: Search engines such as Google prefer RSS feeds. Creating an RSS feed, with the targeted keyword in the title of the feed, and submitting it to as many RSS aggregators as possible can significantly help improve the rankings of a website.
Blogs: Blogs, especially those posted at Blogger, Squidoo and WordPress, are given preference by search engines like Google. Creating blogs whose content is focused on target keywords, and posting them on popular blog sites, can have a highly favorable impact on the rankings of a website. These blogs can also be converted into RSS feeds and submitted to make the maximum impact.
Incoming links: Incoming links can be created through active forums and Yahoo Answers. Posting keyword-targeted articles on active forums and asking questions on Yahoo Answers, while focusing on target keywords, would significantly boost the popularity of a website. Each article posted in various forums must have a link that can redirect readers to the website.
More Articles at www.linkchili.com
Answer by: George Kaloyanov, Aplus.Net Knowledge Base Support
The heart of the Google search engine is the PageRank™, a system for ranking web pages developed by Google’s founders Larry Page and Sergey Brin at Stanford University. PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”
Important, high-quality sites receive a higher PageRank, which Google remembers each time it conducts a search. Of course, important pages mean nothing to you if they don’t match your query. So, Google combines PageRank with sophisticated text-matching techniques to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines all aspects of the page’s content (and the content of the pages linking to it) to determine if it’s a good match for your query.
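The “voting” idea above can be illustrated with the classic power-iteration sketch below. This is a toy version of the published PageRank formula with the usual 0.85 damping factor, not Google’s production algorithm, and the three-page web is invented for the example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: `links` maps each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:  # each outgoing link is a "vote" for q
                    new[q] += share
            else:  # dangling page: spread its rank over everyone
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# A link from page A to page B is a vote, by A, for B:
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))
```

In this tiny web, page C collects votes from both A and B, so it ends up with the highest rank; a vote from the higher-ranked C in turn lifts A above B.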
With the Google Rankings tools you can check the most important Google measurements for your website:
- Saturation: The number of pages indexed by Google across your entire website.
- Back Links: The number of indexed pages on external websites that link to your website.
- Google PageRank: Google’s measure of the quality of a website on a scale of 0 to 10, based on various parameters including the “Saturation” and the “Back Links”.
You can also compare your website’s score with the competition and track changes by generating history reports.
The best way to get you informed about SEO is to do a little ‘Q’ and ‘A’. We’ll start with some basics in case you are just starting out, and we’ll end with how to put your knowledge to use.
So what is Free Search Engine Optimization anyway? • It is simply the process by which you get your website ranked, or listed, high in the “Organic” search results for certain keywords.
What are the Organic search results? • They are the non-paid, or non-sponsored, listings that are on each page of the search engine results and are located under the sponsored listings on the left.
What are ‘Keywords’ and how do they affect me? • These are the words that people will type into the search engine to find your website. For example if you are a dog groomer in Deland you would have keywords like ‘dog groomer Deland’, ‘Deland, FL dog groomer’, ‘pet grooming Deland’, etc.
How do I get ‘ranked’ for my keywords? • The main secret to SEO is creating links to your website. You can either buy links or create your own. You can also get placed in the sponsored results through ‘Pay Per Click’. And there are still others, SEO companies, who aggressively promise Number 1 rankings by doing Who Knows What, starting at $200 per month. In the interest of saving money, I prefer creating my own links, as it is the only free method and also gives you complete control over the process.
What is pay-per-click (PPC)? • You can pay the search engine (Google, Yahoo, etc.) for placing your Website in the search engine search results. You actually pay them every time someone clicks on your link. If you have a good knowledge of converting ‘guests’ into ‘leads’ then this may be a good option for you. If you don’t you could end up with a large bill at the end of the month and no new business to show for it! (I have never needed to use it myself, but I’ve heard that the price you pay per click can increase over the course of the month without you even knowing it; just something to look in to if you decide to research this method.)
Does PPC affect organic rankings? • No. They neither help nor hurt your ‘natural’ search engine rankings.
So where do I begin ‘organic’ SEO? • The first thing you need is a good website with lead-capture features, because you can drive the entire population of your continent to your site, but if visitors are confused by it, or you don’t give them a good reason to either buy or leave their information, you will be out of luck.
Every online business owner should be aware of a few simple SEO tips that they can implement themselves to increase their search engine ranking. Often, in haste to improve an off- or online business, people will rush to hire an (overpriced) consultant.
Without question, search engine optimization is important for any person or entity wanting to increase their ranking with the search engines. Of equal importance, however, is developing a basic understanding of how SEO works and why.
By quickly bringing in outside consultants to address this need, we sidestep that learning process ourselves, which ends up costing us both a valuable education and money. Let’s have a look at a few basic SEO techniques we can implement ourselves that should improve both our search engine ranking and our budget.
SELECTING A DOMAIN NAME
As trite as this step may seem to many in the process of setting up their online business, it is really a critical one. The name selected should, if possible, reflect what the site or business is about. As a rule, the more specific you can be, the better chance you’ll have of getting your site indexed quickly. It is also wise to keep your domain as short as possible.
The temptation to get too ‘cute’ with a catchy name should be avoided at this stage unless it is directly related to your site, business, or perhaps one of your products.
SELECTING A TITLE TAG
This lets the search engines know what your site is about. The title tag is the first thing a search engine will see, and your site will be judged on how relevant the tag is to the content of your website. It is highly recommended to ensure your most important keywords are located in the title tag, to increase the relevancy between the site and title.
MONITOR YOUR KEYWORD DENSITY
The keywords you select should all be relevant to the content of the site itself. Your choice of keywords is critical, so be sure there is a DIRECT relationship between these words and your site’s intent.
Close attention should also be paid to the way the words are used within the site content and the frequency of their use. Be sure your selected words are used naturally, as would be the case in a casual conversation, and take care NOT to use them too frequently.
Overuse could get you penalized by the search engines, thereby lowering your search engine rankings. If the need to use some words frequently due to their relevance arises, have a list of synonyms available for substitution.
A keyword density of between 2 and 5 percent is a suggested range to target, to both accurately reflect your site’s content and keep you in ‘good graces’ with search engines.
Links that come from external sites indicate to search engines how popular, relevant, or important your site is. The more links you have (within reason; don’t go overboard), the higher the regard search engines tend to have for your site. This usually results in a higher ranking for your site.
When leaving a link back to your site on other sites you visit, it is recommended to use keywords from your site in the anchor text, for example <a href="http://www.yoursite.com/">your main keyword</a> rather than a plain “click here”.
These simple SEO tips we’ve discussed here today should greatly improve the optimization of your site, if these techniques haven’t already been implemented. Paying attention to the content of your site and how you use your keywords will definitely enhance your standing in any search engine rankings.
Keywords. Keywords. Keywords. I swear, that’s all I hear from my colleagues, that and the assumption that keywords are the most important part of SEO. Well, I disagree. I don’t believe keywords are the most important building block of SEO. Keyword phrases mean nothing unless they’re used in a specific, contextual environment.
Think about it. Many search engine spammers use software spiders to “borrow” content from competitor sites to generate doorway-page content. Do spam doorway pages generate high-quality search engine traffic and conversions? Not likely, especially if you’ve ever viewed some of these poorly designed and unusable pages. The contextual environment is substandard; therefore, the keyword phrases have little meaning.
The most important building block of SEO is the information architecture. If you want your HTML/XHTML, audio, video, and image files to generate qualified search engine traffic, the key ingredient to making these files appear relevant is the information architecture and the interface that communicates it.
Information Architecture versus Interface
Many SEOs utilize the terms “information architecture” and “site architecture” without truly understanding their meaning. Information architecture refers to the organization of site content into groups. Navigation is part of the user interface. Unfortunately, too many SEOs confuse a site’s actual information architecture with the interface.
How is your site’s information grouped? Are all video files (if used) placed in a directory labeled “videos” (or some other relevant label), and do you give search engines easy access to those files? When you use videos on Web pages, are they used as eye candy or to highlight relevant concepts on key pages?
How you arrange information on a Web page communicates to search engines and site visitors alike the content you believe is important. Keywords are a part of that interface.
The Meaning of Search
Interestingly, as the search industry has evolved, it seems the word “search” has come to mean only the querying process. In other words, type in two to three keywords into Google and get results — that’s search.
Search behavior doesn’t only encompass the querying process. Scanning is also a search process. When people view search results, they scan the page for information. When people click a search result to get to a Web page, they scan the Web page to determine whether or not the page’s content matches their search query.
People navigate Web sites. They look for visual cues that lead them in the right direction. Some of the visual cues are textual: titles, headings, etc. Some of the visual cues are graphical: navigation buttons, image maps, photos, etc. Browsing, reading, and scanning are also types of search behaviors.
Many SEOs and their clients need to take their blinders off. There’s a plethora of search behavior outside of rankings. I find the obsession with ranking to be rather narrow-minded and annoying, especially from people who should know better.
Why do I bring up search behavior? In order for a Web site to be search-engine and user friendly, the site’s information architecture and interface must accommodate a wide variety of directed and semi-directed search behavior, not just querying behavior.
Search Usability Analysis
One of my SEO responsibilities is to conduct heuristic analyses of completed Web sites or advanced prototypes. I not only evaluate the user friendliness of the site, I also evaluate the search friendliness of the site.
Sometimes, the analysis turns out in the client’s favor. All the client site might need are some copywriting adjustments, some high-quality link development, and a site map. But most of the time, the core problem lies with the site’s information architecture and poor interface.
Granted, not all copywriting changes are easy. Some ClickZ readers have written to me about content management systems (CMS) that generate the same HTML title tag for every page on a site. Imagine having to write over 100,000 unique title tags from scratch when moving to a better CMS.
I find the most expensive change to make to a site involves restructuring a site’s information architecture after the site has launched. If I can catch a site in the planning or early prototype stages, I can help the site be 100 percent search-engine friendly. If I catch the site after it’s been built, it’s more challenging because, all too often, upper management has fallen in love with the pretty interface and doesn’t want to change it.
If I had one piece of advice for everyone about SEO and information architecture, it’s this: bring in a search usability specialist early in the design, redesign, or early prototype stages. Likewise, if you want to purchase a new CMS for your site, bring in an SEO professional to evaluate them. A search usability analysis can save your organization thousands, or even millions, of dollars in advertising expenses.
A listing of all sites on the Web, sorted by traffic…
Alexa computes traffic rankings by analyzing the Web usage of millions of Alexa Toolbar users and data obtained from other, diverse traffic data sources. The information is sorted, sifted, anonymized, counted, and computed, until, finally, we get the traffic rankings shown in the Alexa service. The process is relatively complex, but if you have a need to know, please read on.
The traffic rank is based on three months of aggregated historical traffic data from millions of Alexa Toolbar users and data obtained from other, diverse traffic data sources, and is a combined measure of page views and users (reach). As a first step, Alexa computes the reach and number of page views for all sites on the Web on a daily basis. The main Alexa traffic rank is based on a value derived from these two quantities averaged over time (so that the rank of a site reflects both the number of users who visit that site as well as the number of pages on the site viewed by those users). The three-month change is determined by comparing the site’s current rank with its rank from three months ago. For example, on July 1, the three-month change would show the difference between the rank based on traffic during the first quarter of the year and the rank based on traffic during the second quarter.
Traffic is computed for sites, which are typically defined at the domain level. For example, the Web hosts http://www.msn.com, carpoint.msn.com and slate.msn.com are all treated as part of the same site, because they all reside on the same domain, msn.com. An exception is blogs or personal home pages, which are treated separately if they can be automatically identified as such from the URLs in question. Also, sites which are found to be serving the same content (mirrors) are generally counted together as the same site.
Reach measures the number of users. Reach is typically expressed as the percentage of all Internet users who visit a given site. So, for example, if a site like yahoo.com has a reach of 28%, this means that of all global Internet users measured by Alexa, 28% of them visit yahoo.com. Alexa’s one-week and three-month average reach are measures of daily reach, averaged over the specified time period. The three-month change is determined by comparing a site’s current reach with its values from three months ago.
Page views measure the number of pages viewed by site visitors. Multiple page views of the same page made by the same user on the same day are counted only once. The page views per user numbers are the average numbers of unique pages viewed per user per day by the visitors to the site. The three-month change is determined by comparing a site’s current page view numbers with those from three months ago.
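Alexa does not publish its exact formula, so the following Python sketch is only an illustration of the idea described above: combine average reach and average page views into one score and sort sites by it. The geometric-mean combination, the site names, and the numbers are all assumptions for the example, not Alexa’s actual method:

```python
from statistics import geometric_mean

def traffic_rank(sites):
    """sites maps a site name to (average daily reach, average daily page views).
    Returns names ordered best first by a combined score; the geometric mean
    is an illustrative stand-in for Alexa's undisclosed combination."""
    score = {name: geometric_mean(metrics) for name, metrics in sites.items()}
    return sorted(score, key=score.get, reverse=True)

ranked = traffic_rank({
    "bigportal.example": (0.28, 9.0),    # high reach, many page views
    "niche.example":     (0.001, 30.0),  # few users who each read many pages
    "newblog.example":   (0.0001, 2.0),  # a brand-new site
})
print(ranked)  # best-ranked site first
```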
How Are Traffic Trend Graphs Calculated?
The Trend graph shows you a three-day moving average of the site’s daily traffic rank, charted over time. The daily traffic rank reflects the traffic to the site based on data for a single day. In contrast, the main traffic rank shown in the Alexa Toolbar and elsewhere in the service is calculated from three months of aggregate traffic data.
Daily traffic rankings will sometimes benefit sites with sporadically high traffic, while the three-month traffic ranking benefits sites with consistent traffic over time. Since we feel that consistent traffic is a better indication of a site’s value, we’ve chosen to use the three-month traffic rank to represent the site’s overall popularity. We use the daily traffic rank in the Trend graphs because it allows you to see short-term fluctuations in traffic much more clearly.
It is possible for a site’s three-month traffic rank to be higher than any single daily rank shown in the Trend graph. On any given day there may be many sites that temporarily shoot up in the rankings. But if a site has consistent traffic performance, it may end up with the best ranking when the traffic data are aggregated into the three-month average. A good analogy is a four-day golf tournament: if a different player comes in first at each match, but you come in second at all four matches, you can end up winning the tournament.
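The three-day moving average behind the Trend graph is easy to reproduce; here is a minimal Python sketch (the daily ranks are made-up numbers):

```python
def moving_average(values, window=3):
    """Trailing moving average over `window` points (shorter at the start)."""
    return [
        sum(values[max(0, i - window + 1): i + 1]) / min(i + 1, window)
        for i in range(len(values))
    ]

# Made-up daily ranks: erratic at first, then consistent.
daily_rank = [120000, 90000, 150000, 60000, 60000, 60000]
print(moving_average(daily_rank))
```

Note how the consistent 60,000 ranks at the end pull the average down, matching the golf-tournament analogy above.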
What is Data Normalization?
Alexa’s ranking methodology corrects for a large number of potential biases and calculates the ranks accordingly. We normalize based on the geographic location of site visitors. We correct for biases in the demographic distribution of site visitors. We correct for potential biases in the data collected from our Alexa Toolbar to better represent those types of site visitors who might not use an Alexa Toolbar.
The movers and shakers list is based on changes in average reach (numbers of users). For each site on the net, we compute the average weekly reach and compare it with the average reach during previous weeks. The more significant the change, the higher the site will be on the list. The percent change shown on the Movers & Shakers list is based on the change in reach. It is important to note that the traffic rankings shown on the Movers & Shakers page are weekly traffic rankings; they are not the same as the three-month average traffic rankings shown in the other Alexa services and are not the same as the reach numbers used to generate the list.
Some Important Disclaimers
The traffic data are based on the set of toolbars that use Alexa data, which may not be a representative sample of the global Internet population. To the extent that our sample of users differs from the set of all Internet users, our traffic estimates may over- or under-estimate the actual traffic to any particular site.
In some cases traffic data may also be adversely affected by our “site” definitions. With tens of millions of hosts on the Internet, our automated procedures for determining which hosts are serving the “same” content may be incorrect and/or out-of-date. Similarly, the determinations of domains and home pages may not always be accurate.
Sites with relatively low traffic will not be accurately ranked by Alexa. Alexa’s data comes from a large sample of several million Alexa Toolbar users and other traffic data sources; however, the size of the Web and concentration of users on the most popular sites make it difficult to accurately determine the ranking of sites with fewer than 1,000 monthly visitors. Generally, traffic rankings of 100,000 and above should be regarded as not reliable. Conversely, the closer a site gets to #1, the more reliable its traffic ranking becomes.