Prioritize Your Social Media Efforts
http://ping.fm/y2D1P
Wednesday, November 18, 2009
Tuesday, October 27, 2009
Monday, October 26, 2009
Thursday, October 22, 2009
Yahoo! posts 3-fold jump in net income; sales slip 12%
http://timesofindia.indiatimes.com/business/international-business/Yahoo-posts-3-fold-jump-in-net-income-sales-slip-12/articleshow/5144766.cms
Friday, September 18, 2009
By purchasing the Campustags Discount Card, you are agreeing to the following specific terms and conditions of the Card Agreement. You understand that Campustags and its subsidiaries and vendors and partners (collectively, "Campustags") make available online information and services to cardholders of this program ("Cardholders").
http://ping.fm/vx1Cv
Thursday, September 17, 2009
Thursday, August 27, 2009
Flickr tops TIME's list of Best 50 Websites of 2009
The hottest thing on the Internet is not social networking websites like Facebook and Twitter but Flickr, the popular photo-sharing portal, and the proof is that it has topped TIME's list of the best 50 websites this year.
One of the noticeable trends in this year's list, which was released this week, was on-demand video services, like YouTube, Vimeo and US services Hulu and Netflix.
However, the top two in the list were related to photographs, with California Coastline following Flickr in second place.
Third in the list was bookmark website Delicious, while community weblog Metafilter stood at the fourth place.
Popurls, the mashup of the web's most visited social news sites and portals, grabbed the fifth spot in the list.
Twitter ranked sixth and Facebook 31st, while YouTube and Hulu came in at 12th and 14th place.
TIME's list of the 50 Best Websites of 2009 is:
1. Flickr
2. California Coastline
3. Delicious
4. Metafilter
5. popurls
6. Twitter
7. Skype
8. Boing Boing
9. Academic Earth
10. OpenTable
11. Google
12. YouTube
13. Wolfram|Alpha
14. Hulu
15. Vimeo
16. Fora TV
17. Craiglook
18. Shop Goodwill
19. Amazon
20. Kayak
21. Netflix
22. Etsy
23. PropertyShark.com
24. Redfin
25. Wikipedia
26. Internet Archive
27. Kiva
28. ConsumerSearch
29. Metacritic
30. Pollster
31. Facebook
32. Pandora and Last.fm
33. Musicovery
34. Spotify
35. Supercook
36. Yelp
37. Visuwords
38. CouchSurfing
39. BabyNameWizard.com's NameVoyager
40. Mint
41. TripIt
42. Aardvark
43. drop.io
44. Issuu
45. Photosynth
46. OMGPOP
47. WorldWideTelescope
48. Fonolo
49. Get High Now
50. Know Your Meme (ANI)
Thursday, August 20, 2009
Tuesday, August 4, 2009
Wednesday, July 29, 2009
Friday, June 26, 2009
Friday, June 5, 2009
Satya Nadella Keynote At Bing Search Summit
Satya Nadella, SVP of Microsoft’s Online Services Division, gave the morning keynote at the Microsoft Search Summit. It was an introduction to and tour of Bing and adCenter upgrades and improvements. Nadella began with a review of the search market and its growth. He was initially apologetic to the audience about Microsoft’s market share. He proceeded to outline the problems with the current state of search that Bing tries to address.
Nadella said “only 1 in 4 queries deliver successful results.” This is based on Microsoft’s observation of search user behavior from historical Live Search logs and its toolbar installs (which give it visibility into search behavior on other engines). Repeat queries, refinements, and abandonments indicate dissatisfaction with the current state of search.
source:
http://searchengineland.com/satya-nadella-keynote-at-bing-search-summit-20476
“In the quest to find the perfect search engine, we still have a lot of room.”
Nadella explains that people engage in long search sessions. Almost 50% of time spent searching is spent during sessions longer than 30 minutes. But those sessions, according to Microsoft, represent only 5% of search sessions overall. He also showed the following consumer data focused on Microsoft’s four strategic verticals.
The slide shows 66% of people are using search more frequently as a decision-making tool; and in their strategic verticals:
* 75% product purchases
* 62% local activity
* 45% flights or hotels
* 43% healthcare
Nadella explains Bing’s “task orientation” and begins a hands-on walk-through of the site. He shows the homepage and discusses its strong “emotional appeal.” He says that among consumers it’s one of the most liked features of the site. Then he takes us on a tour of the site and concretely points out the features (e.g., Best Match, Instant Answers) that are designed to minimize clicks and respond to typical user behaviors.
He shows a local search “San Diego Events” and points out a range of information about events but also about San Diego more broadly. He discusses health search and authoritative answers, with health-related content and articles that can be read on the SERP. Nadella goes on to discuss shopping and the range of information that can be obtained on the SERP without having to click away. In general, what these and other examples collectively show is the deeper integration of verticals and related vertical content into the search result (to avoid too many clicks and the back button).
Google Squared Is Now Live
Google has pulled the covers off of Squared, the search tool that was first demoed during the Searchology event last month.
Google Squared takes a search query and tries to present the results in tabular form — like an online spreadsheet of sorts. Sometimes it works very well and provides interesting results, and other times not so much. Here’s a screenshot of the “very well” example, a search for [u2 albums]:
Google Squared
Squared is very configurable; you can add and delete rows and/or columns, and sometimes replace data that Squared puts in your results. It’s a search tool that’s still part of Google Labs, and frankly, its overall usefulness is questionable. But that, presumably, is exactly what Google Labs is for — testing new ideas and seeing what sticks.
An official announcement of today’s launch is expected soon.
Tuesday, June 2, 2009
Tuesday, May 26, 2009
A glossary of SEO terms
Within any professional industry there are some in-house terms and phrases that need definition and clarity for the uninformed. Search engine optimization is no different. I’ve written some posts in the past that mention some of these without elaborating on them - this post will hopefully serve as a reference post as time goes on, educating you guys who aren’t familiar with the terminology. It’s very easy to include some of these phrases in my writing without realising that some people may not have a clue what I’m on about - so for completeness, here’s my SEO glossary.
Blackhats / Whitehats
Blackhat generally refers to a search engine optimiser who doesn’t play by the rules and is constantly pushing the boundaries of what the major search engines will allow. If it is well known, for example, that links are important to rank well, a blackhat will perhaps spam or create automated programs to put links around the web. I think the term initially came from some connection with wizardry - search engine optimisers being the wizards of the web? Black being evil, white representing pure or good. Whitehats, on the other hand, are search engine optimisation professionals who play by the rules and follow things like Google’s guidelines. A whitehat will err on the side of caution and take care not to trip any Google penalties; a blackhat is normally involved with naughty tactics on throwaway domains, and uses this knowledge to bolster his whitehat efforts.
Sitewide
A sitewide link is one which is available through your site, for example in the footer or on a menu which exists on every page within your website. A sitewide link will commonly not pass as much link juice as other links.
Nofollow
A link which carries the rel="nofollow" attribute. This was a measure introduced to combat spam, and is implemented within common platforms such as WordPress to avoid comment spam. A link with the rel="nofollow" attribute doesn’t carry any weight and doesn’t count towards your overall PageRank.
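In HTML the attribute sits on the anchor tag itself; a minimal illustration (example.com is a placeholder):

```html
<!-- A standard link, which passes link juice: -->
<a href="https://example.com/">Example site</a>

<!-- A nofollowed link, which passes none: -->
<a href="https://example.com/" rel="nofollow">Example site</a>
```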
Dofollow
A standard link which hasn’t been nofollowed. This type of link will pass PageRank to another website, or to another page on the same website.
Link juice
Commonly used alongside the term PageRank - link juice can be defined as the amount of influence or PageRank being passed between web pages.
Pagerank
PageRank is Google’s algorithmic way of deciding a page’s importance. It is determined by the number of inbound links pointing at the page, and is used (as one of many factors) in determining the results in the SERPs.
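The underlying idea can be sketched as a power iteration over a link graph. This is a toy illustration, not Google's actual implementation: the three-page graph is made up, and 0.85 is the damping factor from the original PageRank paper.

```python
# Toy PageRank via power iteration on a tiny, made-up link graph.
# Keys are pages; values are the pages they link out to.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}

damping = 0.85  # damping factor from the original PageRank paper
n = len(links)
rank = {page: 1.0 / n for page in links}  # start with a uniform distribution

for _ in range(50):  # iterate until the ranks stabilise
    new_rank = {}
    for page in links:
        # Sum the rank flowing in from every page that links here;
        # each source splits its rank evenly across its outbound links.
        incoming = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / n + damping * incoming
    rank = new_rank

print(sorted(rank, key=rank.get, reverse=True))  # pages ordered by importance
```

Here page "c" ends up ranked highest because it receives links from both other pages, which is the intuition behind inbound links driving importance.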
On Page techniques
Google uses both on page and off page techniques to determine the results in the SERPs - on page techniques refer to changes made to the web page code which help align the physical page code with relevance for a particular search phrase or term.
Off Page techniques
Off page techniques are used to enhance the relevance of a webpage by gaining links. Off page techniques may include traditional link building or link baiting.
Traditional link building
Traditional link building refers to the practice of obtaining links from third party websites. This will commonly be performed by a human: asking for link exchanges from other webmasters, submitting websites to relevant directories, or commenting on blogs which are dofollow.
Link Exchanges
Link exchanges or link swaps are used between two webmasters to share traffic and link juice between two websites. They are generally frowned upon, as the two links cancel each other out and look spammy. If in doubt, don’t exchange links - you may end up in a bad neighbourhood.
Search phrase
A search phrase is the set of keywords someone uses to find your website in Google. Some search phrases are in more competitive niches than others, and are harder to rank for.
Link graph
When used in context a link graph represents the network of links that connect sites together. It is the overall picture of how a site is linked to and from.
Link bait
Link bait is content written solely for the purpose of gaining new links and a high influx of new traffic, purely by the nature of the content. I’ve written a post on link bait over here for further reading.
List bait
List bait is the same as link bait but takes the format of a list, e.g. "10 amazing widgets" or "25 top tips for X, Y, or Z".
Competitive Niche
A hard-to-enter online market which caters for a particular niche but is tough to rank for - for example, pharmaceuticals or mortgages. As the price paid per click is high for AdSense adverts, many people chase terms such as mortgages.
Keyword Research
Keyword research is the practice of working out how much potential traffic a keyword gets. See more over here. It may also cover AdSense keyword research (to figure out how much AdSense income a potential keyword may bring a website owner).
Keyword Density
Used within on page techniques, keyword density refers to the distribution of a keyword within a web page. Each search engine favours its own keyword density. To figure out the keyword density, get the word count of your webpage (WC), count the occurrences of a particular keyword (OCC) in it, divide OCC by WC, and multiply by 100. Want to know what the major search engines favour? Have a peek at some of this keyword research.
(OCC / WC) * 100 = Keyword Density for a phrase
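The formula above is straightforward to compute; a minimal Python sketch (the sample page text is made up for illustration):

```python
def keyword_density(text, keyword):
    """Return the percentage of words in `text` matching `keyword`."""
    words = text.lower().split()
    wc = len(words)                       # WC: total word count
    occ = words.count(keyword.lower())    # OCC: keyword occurrences
    return (occ / wc) * 100 if wc else 0.0

page = "seo tips and seo tricks for better seo"
print(keyword_density(page, "seo"))  # 3 occurrences in 8 words -> 37.5
```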
Internal Links
Internal links are any links which point (internally) to other pages on the same website. A good internal linking structure is imperative for good SEO results. Linking back and forth with good link text can help Google determine what your website is about, and thus increase the likelihood of you being found for particular terms. Internal links also help to improve the overall number of page views your website receives, as the pages are well linked together.
Outbound Links
Simply put - outbound links are links which link to other websites, and are sometimes known as external links.
Trust Rank
Trust Rank is a link analysis technique that many SEOs believe is present somewhere within Google’s ranking algorithm; it draws on research conducted (PDF link) at Yahoo and Stanford University on identifying spam.
Link Text or Anchor Text
Link text or anchor text refers to the visible words of a link - for example, "Web Design Ireland". This aids Google in identifying what a site is about, and SEOs commonly use the link text of a link to increase relevance for keywords or phrases.
Spider
A spider, in the context of SEO, is an automated program used to collect and/or mine data. In order for Google to find out about your website, it has to spider (or crawl) the web, following link after link until it finds your site. The more links to your site, and the more frequently you update your content, the more frequently you will get crawled. The information Google picks up from your webpage content is collated in Google’s databases and, after your ranking has been decided, shown in the SERPs. A search engine spider is also sometimes known as a robot.
Robots.txt
A robots.txt file contains instructions for spiders or robots on which pages they are allowed to index. If you wish to disallow access to certain parts of your website and not get those pages listed in Google, you need to have a robots.txt file in place. This is a good resource on the usage of a robots.txt file.
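For illustration, a minimal robots.txt that blocks all spiders from one directory and additionally blocks Googlebot from another (the paths are placeholders):

```
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /drafts/
```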
Backlink
A backlink is a link obtained from a third party website to a page on your website. The more of these that exist, the more traffic you will receive. In addition, multiple backlinks have the added advantage of boosting your PageRank and increasing your relevance in Google. Backlinks are sometimes referred to as inbounds.
301 Redirect
There are a variety of status headers that search engine optimisers need to be aware of, but one of the most important is the 301 redirect. This allows search engines to determine when a page has moved from one part of a site to another, or when a site has migrated from a subdomain to a main domain - or indeed changed completely. You can check the headers a particular webpage returns, but if you are migrating, ask an expert.
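Checking a page's status header can be sketched with Python's standard library. To keep the example self-contained it spins up a tiny local server whose /old path answers with a 301; in practice you would point the same check at your own URLs:

```python
import http.server
import threading
import urllib.error
import urllib.request

# A tiny local server: /old answers with a 301 redirect, everything else 200.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            self.send_response(301)
            self.send_header("Location", "/new")
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# An opener that refuses to follow redirects, so we see the 301 itself.
class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoRedirect)

def status_of(url):
    """Return the HTTP status code for `url` without following redirects."""
    try:
        return opener.open(url).status
    except urllib.error.HTTPError as err:
        return err.code  # 3xx/4xx responses surface here as errors

old_status = status_of(f"http://127.0.0.1:{port}/old")
new_status = status_of(f"http://127.0.0.1:{port}/new")
print(old_status, new_status)  # 301 200
server.shutdown()
```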
Bad Neighbourhood
Bad neighbourhoods describe areas of the web that have been penalised by Google in the past, having engaged in dubious linking practices or cloaking. Gaining an inbound link from - or, more importantly, adding outbound links to - bad neighbourhoods can hamper your SEO efforts at best, or at worst get you completely banned from Google.
Cloaking
Cloaking refers to the practice of sending human visitors one copy of a page, and search engine spiders another. It is commonly performed by detecting the user agent of a browser or spider and sending alternative content to Google. The perceived advantage is that keyword densities, link structures and other search engine ranking factors can be manipulated further without worrying about the readability of a page, or the navigation for humans. It is generally a serious faux pas to engage in cloaking of any kind, and it is well known to be a blackhat technique.
Deep Linking
Deep linking refers to obtaining inbound links to content which is buried (deep) inside your site. Generally the majority of links to your website will hit the homepage. It is better to achieve links to content off the home page, which will improve the pagerank distribution across multiple pages. These are known as deep links.
Black Hole
A black hole site is created when a large tier-1 authoritative site stops providing outbound links - or, if it does provide outbound links, they are made nofollow. If another source is needed, another page is provided on the site for the citation, and as a result all inbound link juice is retained within the black hole. A great example of this is Wikipedia, which tends to dominate the SERPs for some keywords. A few newspaper sites have started to create black holes to try and retain their link juice.
Rank
Rank is used to describe where a particular site appears within the SERPs for particular keywords. Sometimes you will hear SEO professionals talk about outranking someone else; this simply means that they have overtaken them in the SERPs.
SERPS
SERPs stands for search engine results pages. A SERP is the first page you see after you hit search on any major search engine, listing the results for your particular query.
Google Sandbox
The Google Sandbox is conceptually a place where new domains sit for a while after launch, with an algorithmically lowered PageRank. No one knows whether the sandbox exists, and many dispute its existence. Matt Cutts has stated in an interview, however, that “there are some things in the algorithm that may be perceived as a sandbox that doesn’t apply to all industries”.
Domain Age
Domain age refers to how long a particular domain has been registered. It is thought to be a ranking factor inside the Google algorithm, with older domains thought to be more likely to be relevant than new ones. Again, this is something that is fraught with hearsay.
User Agent
The user agent is the client application identifier used to access a web page. For example, if you access a web page using Internet Explorer, the user agent will contain a string that pertains to it - e.g. MSIE 9.0 beta. If a search engine spider is browsing a web page, it will likely identify itself as a bot of some sort; Google, for example, identifies its search engine spider as Googlebot. The user agent is used by web analytics software such as Google Analytics.
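A rough spider check based on the user-agent string can be sketched as below. The bot substrings are common crawler identifiers I've chosen for illustration; note that user agents are trivially spoofed, so serious detection should also verify the requester (e.g. via reverse DNS):

```python
# Common crawler identifiers (illustrative, not exhaustive).
KNOWN_BOTS = ("googlebot", "bingbot", "slurp")

def is_search_spider(user_agent):
    """Rough check: does the user-agent string name a known crawler?"""
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOTS)

print(is_search_spider(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # True
print(is_search_spider("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # False
```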
Bait and Switch
The practice of attracting links for a term, then, once ranking has been achieved for it, switching the content on the page. May be used for a landing page for a service or product.
Blackhats / Whitehats
Blackhat generally refers to a search engine optimiser who doesn’t play by the rules, and is constantly pushing the boundaries of what the major search engines will allow. If it is well known that for example links are important to rank well, a blackhat will perhaps spam or create automated programs to put links around the web. I think the term initially came from some connection with wizardry - search engine optimisers being the wizards of the web? Black being evil, white representing pure or good. Whitehats on the other hand are search engine optimisation professionals that plays by the rules and follows things like Google’s guidelines. A whitehat will err on the side of caution, and takes care not to trip any Google penalties, a Blackhat on the other hand is normally involved with naughty tactics on throw away domains, and uses this knowledge to bolster his Whitehat efforts.
Sitewide
A sitewide link is one which is available through your site, for example in the footer or on a menu which exists on every page within your website. A sitewide link will commonly not pass as much link juice as other links.
Nofollow
A link which carries the rel=”nofollow” attribute - this was a measure used to combat spam, and is implemented within common platforms such as Wordpress to avoid comment spam. A link with the rel=”nofollow” attribute, doesn’t carry any weight, and doesn’t count towards your overall pagerank.
Dofollow
A standard link which hasn’t been No followed. This type of link will pass page rank to another website or page on a website.
Link juice
Commonly used with the term Pagerank - Link juice can be defined as the amount of influence or Page rank being passed between webpages.
Pagerank
PageRank is Google’s algorithmic way of deciding a page’s importance. It is determined by the number of inbound links pointing at the page and used (as one of the many factors) in determining the results in the SERPS.
On Page techniques
Google uses both on page and off page techniques to determine the results in the SERPs - on page techniques refer to changes made to the web page code which help align the physical page code with relevance for a particular search phrase or term.
Off Page techniques
Off page techniques are used to enhance the relevance of a webpage by gaining links. Off page links may include traditional link building or link baiting
Traditional link building
Traditional link building refers to the practise of obtaining links from third party websites. This will commonly be performed by a human, asking for link exchanges from other webmasters, submitting websites to relevant directories or commenting on blogs which are dofollow.
Link Exchanges
Link exchanges or link swaps are used between two webmasters to share traffic and link juice between two websites. They are frowned upon generally as the two links cancel each other out, and look spammy. If in doubt don’t exchange links - you may end up getting into a bad neighborhoods.
Search phrase
A search phrase is the keywords someone uses to find your website in Google. Some search phrases are in more competitive niche’s than others, and are harder to rank for.
Link graph
When used in context a link graph represents the network of links that connect sites together. It is the overall picture of how a site is linked to and from.
Link bait
Link bait is content written solely for the purpose of gaining additional new links and a high influx of new traffic - simply from the content nature. I’ve written a post on link bait over here for further reading.
List bait
List bait is the same as link bait but takes the format of a list. e.g. 10 amazing widgets or 25 top tips for x y or z.
Competitive Niche
A hard to enter market online which caters for a particular niche, but is tough to rank for. For example pharaceuticals or mortgages. As the price paid per click is high for Adsense adverts,many people chase terms such as mortgages.
Keyword Research
Keyword research is the practise of working out how much potential traffic a keyword gets. See more over here. It may also cover Adsense keyword research (to figure out how much Adsense income a potential keyword may bring a website owner).
Keyword Density
Used within on page techniques keyword density refers to the distribution of a keyword within a web page. Each search engine favours its own keyword density. To figure out the keyword density get the word count of your webpage (WC), count the occurance of a particular keyword (OCC) in it, and divide into it. Want to know what the major search engines favour ? Have a peak at some of this keyword research.
(OCC / WC) * 100 = Keyword Density for a phrase
Internal Links
Internal links are any links which link (internally) to other pages on that website. A good internal linking structure is imperative for good SEO results. Linking back and forth with good link text can help Google determine what your website is about, and thus increase the likelyhood of you being found for particular terms. Internal links also help to improve the overall number of page views your website receives, as the pages are well linked together.
Outbound Links
Simply put - outbound links are links which link to other websites, and are sometimes known as external links.
Trust Rank
Trust Rank is a link analysis technique that many SEO’s believe is present somewhere within Google’s ranking algorithm, that uses the research conducted (PDF link) at Yahoo and Stanford University for identifying spam.
Link Text or Anchor Text
Link Text or Anchor Text are the words which are underlined when a link is created. For example Web Design Ireland - this will aid Google in identifying what a site is about. SEO’s commonly use the link text of a link to increase relevance for keywords or phrases.
Spider
A spider in the context of SEO, is an automated program which is used to collect and or mine data. In order for Google to find out about your website, it has to spider (or crawl) the web, clicking on link after link to find your website. The more links to your site, and the more frequently you update your content the more frequently you will get crawled. The information Google picks up from your webpage content is correlated in Google’s databases, and after your ranking has been decided, shown in the SERP’s. A search engine spider is also sometimes known as a robot.
Robots.txt
A robots.txt file contains instructions for spiders or robots on which pages they are allowed to index. If you wish to disallow access to certain parts of your website and not get listed for that page in Google, you need to have a robots.txt file in place. This is a good resource on the usage of a robots.txt file.
Backlink
A backlink is a link obtained from a third party website to a page on your website. The more of these that exist the more traffic you will receive. In addition, multiple backlinks has the added advantage of boosting your pagerank, and increasing your relevance in Google. Backlinks are sometimes referred to as inbounds.
301 Redirect
There are a variety of status headers that SEO optimisers need to be aware of, but one of the most important is the 301 redirect. This allows search engines to determine when a page has changed from one part of a site to another, or if a site has migrated from a subdomain to a main domain - or indeed completely changed. You can check the headers a particular webpage has, but if you are migrating, ask an expert.
Bad Neighbourhood
Bad neighbourhood’s describe areas on the web that have been penalised by Google in the past, and have engaged in dubious linking practises or cloaking. Gaining an inbound or more importantly adding outbound links to bad neighbourhoods can hamper your SEO efforts at best, or at worst; get you completely banned from Google.
Cloaking
Cloaking refers to the practise of sending human visitors one copy of a page, and the site engine spiders another. It is commonly performed by detected the User Agent of a browser or spider, and sending alternative content to Google. The perceived advantage to this is that keyword densities, link structures and other search engine ranking factors can be manipulated further without worrying about the readability of a page, or the navigation for humans. It is generally a serious faux pas to engage in cloaking of any kind, and it is well known to be a Blackhat technique.
Deep Linking
Deep linking refers to obtaining inbound links to content which is buried (deep) inside your site. Generally the majority of links to your website will hit the homepage. It is better to achieve links to content off the home page, which will improve the pagerank distribution across multiple pages. These are known as deep links.
Black Hole
A black hole site is created when a large tier 1 authoritarian site stops providing outbound links - or if it does provide outbound links they are made no follow. If another source is needed, another page is provided on the site for the citation, and as a result all inbound link juice is retained within the blackhole. A great example of this would be Wikipedia, which tends to dominate the SERPs for some keywords. A few newspaper sites have started to create blackholes to try and retain their linkjuice.
Rank
Rank describes where a particular site appears within the SERPs for particular keywords. You will sometimes hear SEO professionals talk about outranking someone else; this simply means they have overtaken them in the SERPs.
SERPS
SERPS stands for Search Engine Results Page. It is the first page you see after you hit search on any major search engine, listing the results for your particular search query.
Google Sandbox
The Google Sandbox is, conceptually, a place where new domains sit for a while after launch with an algorithmically lowered ranking. Many dispute whether the sandbox exists at all. Matt Cutts has stated in an interview, however, that “there are some things in the algorithm that may be perceived as a sandbox that doesn’t apply to all industries”.
Domain Age
Domain age refers to how long a particular domain has been registered. It is thought to be a ranking factor in the Google algorithm, with older domains considered more likely to be relevant than new ones. Again, this is something that is fraught with hearsay.
User Agent
The User Agent is the identifier of the client application used to access a web page. For example, if you access a web page with Internet Explorer the user agent will contain a string that pertains to it, e.g. MSIE 9.0 beta; if a search engine spider is browsing the page, it will usually identify itself as a bot of some sort - Google, for example, identifies its spider as Googlebot. The user agent is used by web analytics software such as Google Analytics.
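A rough sketch of how analytics software classifies a User-Agent string follows; the token patterns here are a tiny assumed subset, while real packages maintain far larger signature lists:

```python
# Sketch: classify a User-Agent string as a known bot or browser.
# The substring patterns are illustrative; production analytics tools
# use much more extensive, regularly updated signature databases.
def classify_user_agent(ua: str) -> str:
    ua_lower = ua.lower()
    if "googlebot" in ua_lower:
        return "bot:Googlebot"
    if "msie" in ua_lower or "trident" in ua_lower:
        return "browser:Internet Explorer"
    if "firefox" in ua_lower:
        return "browser:Firefox"
    return "unknown"

print(classify_user_agent(
    "Mozilla/4.0 (compatible; MSIE 9.0; Windows NT 6.1)"))
# browser:Internet Explorer
print(classify_user_agent(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
# bot:Googlebot
```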
Bait and Switch
The practice of attracting links for a term, then switching the content on the page once ranking has been achieved. It may be used to promote a landing page for a service or product.
Friday, April 24, 2009
My first company-wide meeting at Cvent Inc.
“What you should know about Social Media for corporations”
Social Media in corporations - are you ready?
* What do you know about your customers in the social web?
* Do you know what customers say about you and your brand?
* Do you know how open your customer base is and therefore how vulnerable you are?
* How do you identify and work with key influencers?
* Are you ready if your competitors go after your customers in the social web?
* Are you able to create a social media strategy?
* Do you know how to leverage the social web for your support organization?
* Do you have an idea about ROI and effectiveness of social media?
* Do you know how to measure improvements and success in the social web?
* Did you ever consider involving and leveraging your partners?
* Did it occur to you that the social web may be ideal to compete for mindshare?
* Do you have enough information to decide whether to ignore or engage?
Thursday, April 23, 2009
100 most useful sites on the Internet
The best sites for saving money
Bargaineering
Consumerism Commentary
The Dollar Stretcher
Financial Integrity
Get Rich Slowly
The Simple Dollar
The Simple Living Network
Smart Spending
Wise Bread
The best sites for savvier spending
Angie's List
BillShrink
The Budget Fashionista
Consumer Reports
The Consumerist
Edmunds.com
ePinions
FreeShipping.org
Red Tape Chronicles
Shop It To Me
ShopLocal.com
Best sites for bargain hunting
Ben's Bargains
DealNews
Ebates
FatWallet
MyBargainBuddy.com
Slickdeals
Best sites for grocery savings
CouponMom.com
The Grocery Game
Hot Coupon World
Penny Pincher Gazette
Best sites for coupons
Alex's Coupons
CouponCabin
CouponCode.com
Coupon Mountain
RetailMeNot.com
Best sites for comparison shopping
BeatMyPrice.com
BeatThat!
DiscountMore.com
Best sites for saving and investing
Bankrate.com
Findacreditunion.com
Morningstar
Financial Engines
Best sites for paying for college
FinAid
Savingforcollege.com
Best sites for managing your credit
AnnualCreditReport.com
CardRatings.com
Credit.com
CreditCards.com
CreditMattersBlog.com
myFICO
Best sites for real estate and mortgages
ThinkGlink.com
Mortgage Professor's Web Site
HUD.gov
Making Home Affordable
Best sites for free government help
Federal Citizen Information Center
The Federal Reserve
Govbenefits.gov
Home Energy Saver
Mymoney.gov
Best sites for insurance
Insure.com
United Policyholders
Best sites for doing it yourself
Fix-It Club
Instructables
Nolo
Best sites for travel
Farecast
Kayak
MouseSavers.com
OneBag
SeatGuru
Theme Park Insider
TripAdvisor
The Universal Packing List
WebFlyer
Best sites for really cheap travel
CouchSurfing
HomeExchange.com
Less than a Shoestring
Best sites for charitable giving
Charity Navigator
DonorsChoose.org
GuideStar
Best sites for productivity and careers
The Blog of Tim Ferriss
The Brazen Careerist
Lifehacker
The Thin Pink Line
WebWorkerDaily
Zen Habits
Best sites for free entertainment
Fancast
Hulu
Pandora
Best sites for freebies
Freebiewatch
Free Stuff Times
Hey, It's Free!
Best sites for swapping stuff
Freecycle
PaperBack Swap
TitleTrader
Best sites for free tech stuff
5 Star Support
OnlyFreewares.com
Mozy
Tech-Recipes
Wi-Fi Free Spot
Zoho
Thursday, April 16, 2009
Top Social Media Sites
Ranked by unique worldwide visitors (November 2008)
1. Blogger (222 million)
2. Facebook (200 million)
3. MySpace (126 million)
4. Wordpress (114 million)
5. Windows Live Spaces (87 million)
6. Yahoo Geocities (69 million)
7. Flickr (64 million)
8. hi5 (58 million)
9. Orkut (46 million)
10. Six Apart (46 million)
11. Baidu Space (40 million)
12. Friendster (31 million)
13. 56.com (29 million)
14. Webs.com (24 million)
15. Bebo (24 million)
16. Scribd (23 million)
17. Lycos Tripod (23 million)
18. Tagged (22 million)
19. imeem (22 million)
20. Netlog (21 million)
Wednesday, April 15, 2009
Upcoming change to Google.com search referrals
You may start seeing a new referring URL format for visitors coming from Google search result pages. Up to now, the usual referrer for clicks on search results for the term "flowers", for example, would be something like this:
http://www.google.com/search?hl=en&q=flowers&btnG=Google+Search
Now you will start seeing some referrer strings that look like this:
http://www.google.com/url?sa=t&source=web&ct=res&cd=7&url=http%3A%2F%2Fwww.example.com%2Fmypage.htm&ei=0SjdSa-1N5O8M_qW8dQN&rct=j&q=flowers&usg=AFQjCNHJXSUh7Vw7oubPaO3tZOzz-F-u_w&sig2=X8uCFh6IoPtnwmvGMULQfw
The key difference between these two URLs is that instead of "/search?" the URL contains a "/url?". If you run your own analyses, be sure that you do not depend on the "/search?" portion of the URL to determine if a visit started with an organic search click. Google Analytics does not depend on the "/search?" string in the referrer, so users of Google Analytics will not notice a difference in their reports, but other analytics packages may need to adapt to this change in the referrer string to maintain accurate reports.
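One robust approach, sketched below, is to key off the "q" query parameter, which both referrer formats carry, rather than the "/search?" path; the helper function name and the shortened second URL are illustrative:

```python
# Both old ("/search?") and new ("/url?") Google referrer formats carry
# the search query in the "q" parameter, so a log analysis can extract
# it the same way for either format instead of matching on "/search?".
from urllib.parse import urlparse, parse_qs

def query_from_referrer(referrer: str):
    parsed = urlparse(referrer)
    if parsed.netloc.endswith("google.com") and parsed.path in ("/search", "/url"):
        return parse_qs(parsed.query).get("q", [None])[0]
    return None

old = "http://www.google.com/search?hl=en&q=flowers&btnG=Google+Search"
new = "http://www.google.com/url?sa=t&source=web&q=flowers&usg=AFQjCNH"
print(query_from_referrer(old))  # flowers
print(query_from_referrer(new))  # flowers
```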
Tuesday, April 14, 2009
List of online video sharing websites and video