Thursday, December 29, 2011

SEO Best Practices Part 12 - SEO Tools

I have put together a list of what I think are the best free SEO tools; they are in no particular order.


Web Applications

Google Keyword Tool

The Google Keyword Tool is by far the most useful free keyword research tool available; it not only shows search volumes but also how competitive each keyword is.

Google Webmaster Tools

Webmaster Tools is perfect for checking keywords you may be ranking for but didn’t know about, and also for checking that you have no crawl or HTML errors on your site.

Google Analytics

This is by far the best website analytics application you can use, with lots of extra features like Goals and Real-Time data. The ability to link in your AdSense account so you can see which pages earn you the most money is also great.

Keyword Density Tool

This tool comes in really handy when working on your on-page SEO: it will scan any page and return your most-used keywords and the keyword density of each as a percentage.

On Page Analysis

The on-page analysis tool will scan any page and give you an instant on-page SEO report, unlike many others which require you to enter an email address and wait for the report to arrive.

Majestic SEO Tools

Majestic’s tools are very useful when comparing backlink data between your site and a competitor’s. Majestic has its own web index, which updates more often than Yahoo Site Explorer and also shows a lot more data.

Google Toolbar

The Google Toolbar is great for checking a website’s PageRank!

Alexa Toolbar

The Alexa Toolbar makes it easy to check a website’s Alexa rank, and it also lets you see what a website used to look like.


WordPress Plugins



All In One SEO

The All in One SEO plugin is really effective and can make your blog much more search engine friendly. It adds the ability to manually enter meta descriptions or to let the plugin auto-generate them. It also makes it easy to noindex your tag and category pages.

Redirect Plugin

From time to time you need to set up a redirect, and instead of using WordPress’s basic redirect tool, which only creates a 302 redirect, you should use the Redirect plugin, which allows you to set up 301 redirects easily.

WP Super Cache

Site speed is important for your users, and it also plays a small part in your search engine rankings; this plugin will most definitely increase your site speed.


Free Downloadable Applications



SEO PowerSuite

Link Assistant, a.k.a. SEO PowerSuite, provides a great set of tools, which includes a rank checker, a link assistant, and a WebSite Auditor. They offer both free and paid versions.

Xenu’s Link Sleuth

Link Sleuth is a great tool for checking broken links, 404 pages and crawl errors.
I will add to this list as I find more useful free tools. If this page has been useful to you please do Tweet + Like it!

Tuesday, December 27, 2011

SEO Best Practices Part 11

Don’t use images to replace text

As designers, we always want to make things look as good as possible. This means sometimes replacing ugly browser rendered heading text with a nice smooth image. Try to avoid doing this. Again, search engines can’t see the contents of an image, and this is where you should be putting your keywords.

Use AJAX sparingly

AJAX is great for enhancing the user experience, but try not to overdo it. Content generated with AJAX can’t be linked to. A good rule of thumb is: if what you are loading with AJAX could be an individual page, then avoid using it.

Avoid using Flash for navigation

As tempting as it is to whip out some nice-looking drop-down effects for your site’s navigation using Flash, don’t do it. Search engines have trouble reading Flash files, which means the links used in the navigation can’t be followed.

Build incoming links

The number and quality of incoming links plays a big role in the placement of your site in search results. Having quality and unique content is a good way to get people to link to your site. Another way is to be generous with your own links.

Use a consistent URL

When you build a site, decide from the beginning if you want to use or drop the “www”. Once you decide, stick with it. Search engines, for example, see www.webdesignledger.com and webdesignledger.com as two different sites and as duplicate content, which they do not like.

SEO Best Practices Part 10

When your site is ready

Submit your site to search engines

Google: http://www.google.com/addurl.html

Sitemap

Submit a Sitemap using Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase its coverage of your web pages.

Examine your site

Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing your entire site in a text browser, then search engine spiders may have trouble crawling your site.

Browser compatibility

Test your site to make sure that it appears correctly in different browsers.

Validation

Validate your page/site (Ctrl+Shift+a in Firefox). Validation errors will sometimes stop the spider from crawling your site fully and efficiently.

Webmaster tools

SEO Best Practices Part 9

Image Optimization <img>

All images can have a distinct filename and “alt” attribute, both of which you should take advantage of. The “alt” attribute allows you to specify alternative text for the image if it cannot be displayed for some reason. Why use this attribute? If a user is viewing your site on a browser that doesn’t support images, or is using alternative technologies, such as a screen reader, the contents of the alt attribute provide information about the picture.

Use brief, but descriptive filenames and alt text

Like many of the other parts of the page targeted for optimization, filenames and alt text (for ASCII languages) are best when they’re short, but descriptive. Use dashes in naming conventions versus underscores (this-image.jpg vs. this_image.jpg). Also use relevant keywords/key phrases in your alt text.
  • Example: mickeymouse-stamp.jpg
  • Avoid:
    • Using generic filenames like “image1.jpg”, “pic.gif”, “1.jpg” when possible (some sites with thousands of images might consider automating the naming of images)
    • Writing extremely lengthy filenames
  • Never: Stuff keywords into alt text or copy and paste entire sentences
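
For example, a minimal markup sketch using the stamp-site filename from above (the alt text here is hypothetical):

<img src="/images/mickeymouse-stamp.jpg" alt="Rare vintage Mickey Mouse stamp" />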

Supply alt text when using images as links

If you do decide to use an image as a link, filling out its alt text helps search engines understand more about the page you’re linking to. Imagine that you’re writing anchor text for a text link.
  • Avoid: Using only image links for your site’s navigation
  • Never: Writing excessively long alt text that would be considered spam
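
For example, a sketch of an image used as a link, with its alt text written like anchor text (the URL and filename are hypothetical):

<a href="http://www.leroysstampcollecting.com/articles/top-ten-rarest-stamps.htm">
  <img src="/images/top-ten-rarest-stamps.jpg" alt="Top ten rarest stamps" />
</a>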

Store images in their own directory

Consider consolidating your images into a single directory (e.g. leroysstampcollecting.com/images/). This simplifies the path to your images.
  • Do Not: Have image files spread out in numerous directories and subdirectories across your domain

Use commonly supported file types

Most browsers support JPEG, GIF, PNG, and BMP image formats. It’s also a good idea to have the extension of your filename match with the file type.

SEO Best Practices Part 8

Anchor Text

Choose descriptive text

The anchor text you use for a link should provide at least a basic idea of what the page linked to is about.
  • Example: Top 10 Rarest Stamps
  • Avoid:
    • Writing generic anchor text like “page”, “article”, or “click here”
    • Using text that is off-topic or has no relation to the content of the page linked to
    • Using the page’s URL as the anchor text in most cases (although there are certainly legitimate uses of this, such as promoting or referencing a new website’s address)

Write concise text

Aim for short but descriptive text – usually a few words or a short phrase.
  • Avoid: Writing long anchor text, such as a lengthy sentence or short paragraph of text

Format links so they’re easy to spot

Make it easy for users to distinguish between regular text and the anchor text of your links. Your content becomes less useful if users miss the links or accidentally click them.
  • Avoid: Using CSS or text styling that make links look just like regular text

Anchor text for internal links

You may usually think about linking in terms of pointing to outside websites, but paying more attention to the anchor text used for internal links can help users and search engines navigate your site better.
  • Avoid:
    • Using excessively keyword-filled or lengthy anchor text just for search engines
    • Creating unnecessary links that don’t help with the user’s navigation of the site

Write an anchor title

Repeating the link’s text in the anchor’s title attribute helps increase usability and can also benefit page crawling.
  • Example: <a href="http://www.leroysstampcollecting.com/collecting/kids" title="kids stamp collecting">kids stamp collecting</a>

SEO Best Practices Part 7

Content

Creating compelling and useful content will likely influence your website more than any of the other factors discussed here. Users know good content when they see it and will likely want to direct other users to it.

Write easy-to-read text

Users enjoy content that is well written and easy to follow.
  • Avoid:
    • Writing sloppy text with many spelling and grammatical mistakes
    • Embedding text in images for textual content (users may want to copy and paste the text and search engines can’t read it)

Stay organized around the topic

It’s always beneficial to organize your content so that visitors have a good sense of where one content topic begins and another ends. Breaking your content up into logical chunks or divisions helps users find the content they want faster.
  • Avoid: Dumping large amounts of text on varying topics onto a page without paragraph, subheading, or layout separation

Use relevant language

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time baseball fan might search for NLCS, an acronym for the National League Championship Series, while a new fan might use a more general query like baseball playoffs. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google AdWords provides a handy Keyword Tool that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Webmaster Tools provides you with the top search queries your site appears for and the ones that led the most users to your site.

Create fresh, unique content

New content will not only keep your existing visitor base coming back, but also bring in new visitors. You want the search engines to see your Web presence is growing, not shrinking. A web site with 100 pages that is re-launched with 500 pages tends to do better because of the “tail” traffic (those additional 400 pages attract direct traffic), and the site seems to get a bump in “authority” because the entire site has grown.
  • Avoid:
    • Rehashing (or even copying) existing content that will bring little extra value to users
    • Having duplicate or near-duplicate versions of your content across your site

Offer exclusive content or services

Consider creating a new, useful service that no other site offers. You could also write an original piece of research, break an exciting news story, or leverage your unique user base. Other sites may lack the resources or expertise to do these things.

Create content primarily for your users, not search engines

Designing your site around your visitors’ needs while making sure your site is easily accessible to search engines usually produces positive results.
  • Avoid: Having blocks of text like “frequent misspellings used to reach this page” that add little value for users
  • Never:
    • Insert numerous unnecessary keywords aimed at search engines but are annoying or nonsensical to users
    • Deceptively hide text from users, but display it to search engines

Write great articles about your area of expertise

By doing this you become a major source of news and information in your area of expertise.
  • Avoid: Burdening these articles with sales pitches for your products or services

Develop your content close to the top of the page (output HTML)

Having an <h1> tag close to or right after the opening <body> tag increases its importance with search engines and therefore gives that heading’s text more weight. Having your content follow closely after that <h1> has a similar effect. It is suggested that structuring your page’s framework so that the main content comes just after the <body> tag (at the top), followed by less important elements, will increase the likelihood that the page ranks higher.
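
As a rough sketch of that structure (the page titles and content are hypothetical), the <h1> and main content sit right after the opening <body> tag, with less important elements further down:

<body>
  <h1>Rare Vintage Stamps</h1>
  <p>Main article content about rare vintage stamps goes here...</p>
  <!-- less important elements come after the main content -->
  <div id="sidebar">Related links</div>
  <div id="footer">Copyright notice</div>
</body>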

Include your keywords/key phrases in your content

Using your keywords/key phrases within your body content signals their importance when the page is spidered and reinforces what the page is about.

SEO Best Practices Part 6

Headings <h1>, <h2>, <h3>, <h4>

Write an outline

Similar to writing an outline for a large paper, put some thought into what the main points and sub-points of the content on the page will be, and decide where to use heading tags appropriately.
  • Avoid:
    • Placing text in heading tags that wouldn’t be helpful in defining the structure of the page
    • Using heading tags where other tags like <em> and <strong> may be more appropriate
    • Erratically moving from one heading tag size to another
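
For instance, a simple outline for the stamp site might map to heading tags like this (the titles are hypothetical):

<h1>Leroy's Stamp Collecting</h1>
  <h2>Rare Vintage Stamps</h2>
    <h3>Mickey Mouse Stamp</h3>
    <h3>Inverted Airmail Stamp</h3>
  <h2>Stamp Collecting for Kids</h2>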

Use headings sparingly across the page

Use heading tags where it makes sense. Too many heading tags on a page can make it hard for users to scan the content and determine where one topic ends and another begins.
  • Avoid:
    • Excessively using heading tags throughout the page
    • Putting all of the page’s text into a heading tag
    • Using heading tags only for styling text and not presenting structure

SEO Best Practices Part 5

Site navigation

A sitemap (lower-case) is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it’s mainly aimed at human visitors.
An XML Sitemap (upper-case) file, which you can submit through Google’s Webmaster Tools, makes it easier for search engines to discover the pages on your site. Using a Sitemap file is also one way (though not guaranteed) to tell Google which version of a URL you’d prefer as the canonical one (e.g. http://leroysstampcollecting.com/ or http://www.leroysstampcollecting.com).

Naturally flowing hierarchy

Make it as easy as possible for users to go from general content to the more specific content they want on your site. Add navigation pages when it makes sense and effectively work these into your internal link structure.
  • Avoid:
    • Creating complex navigation links, e.g. linking every page on your site to every other page
    • Going overboard with cutting up your content (it takes twenty clicks to get to deep content)

Text for navigation

Controlling most of the navigation from page to page on your site through text links makes it easier for search engines to crawl and understand your site. Many users also prefer this over other approaches, especially on some devices that might not handle Flash or JavaScript.
  • Avoid: Having a navigation based entirely on drop-down menus, images, or animations (many, but not all, search engines can discover such links on a site, but if a user can reach all pages on a site via normal text links, this will improve the accessibility of your site)
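
A minimal sketch of plain text-link navigation (the section names and URLs are hypothetical):

<ul id="navigation">
  <li><a href="/">Home</a></li>
  <li><a href="/articles/">Articles</a></li>
  <li><a href="/vintage/">Vintage Stamps</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>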

Breadcrumb navigation

A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, left-most link and list the more specific sections out to the right.
  • Example: Leroy’s Stamp Collecting > Articles > Top Ten Rarest Stamps
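
In markup, that breadcrumb can be nothing more than ordinary text links (the URLs are hypothetical):

<p class="breadcrumb">
  <a href="/">Leroy's Stamp Collecting</a> &gt;
  <a href="/articles/">Articles</a> &gt;
  Top Ten Rarest Stamps
</p>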

HTML sitemap and XML Sitemap

A simple sitemap page with links to all of the pages or the most important pages (if you have hundreds or thousands) on your site can be useful. Creating an XML Sitemap file for your site helps ensure that search engines discover the pages on your site.
  • Avoid:
    • Letting your HTML sitemap page become out of date with broken links
    • Creating an HTML sitemap that simply lists pages without organizing them, for example by subject
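
A minimal XML Sitemap sketch containing a couple of hypothetical URLs looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.leroysstampcollecting.com/</loc>
  </url>
  <url>
    <loc>http://www.leroysstampcollecting.com/articles/top-ten-rarest-stamps.htm</loc>
  </url>
</urlset>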

Consider what happens when a user removes part of your URL

Some users might navigate your site in odd ways, and you should anticipate this. For example, instead of using the breadcrumb links on the page, a user might drop off a part of the URL in the hopes of finding more general content. He or she might be visiting http://www.leroysstampcollecting.com/news/2008/upcoming-stamp-collecting-shows.htm, but then enter http://www.leroysstampcollecting.com/news/2008/ into the browser’s address bar, believing that this will show all news from 2008. Is your site prepared to show content in this situation or will it give the user a 404 (“page not found” error)? What about moving up a directory level to http://www.leroysstampcollecting.com/news/?

Have a useful 404

Users will occasionally come to a page that doesn’t exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user’s experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. Google provides a 404 widget that you can embed in your 404 page to automatically populate it with many useful features. You can also use Google Webmaster Tools to find the sources of URLs causing “not found” errors.
  • Avoid:
    • Allowing your 404 pages to be indexed in search engines (make sure that your web server is configured to give a 404 HTTP status code when non-existent pages are requested)
    • Providing only a vague message like "Not found", "404", or no 404 page at all
    • Using a design for your 404 pages that isn’t consistent with the rest of your site

Splash pages

A splash page is a main entry page that displays either a large graphic image or a Flash animation, usually with a link to Enter a web site or Skip Intro (skip the animated introduction page). Splash pages usually redirect to a new web page after the animation has completed. As you might expect, splash pages typically lack keyword- or key phrase-rich content, as they contain little or no visible body text other than Enter or Skip Intro links. Given little or no text content, the search crawlers have nothing to index.
Typically, splash pages use redirects to automatically advance the user to the web site’s actual homepage. Currently, search engines tend not to index web sites that use redirects, and they’ll ban web sites that create artificial redirects in an attempt to achieve higher rankings. So, by using a splash page that contains little or no text content, and uses redirects, you’ll likely have ruined your chances of having your web site indexed – let alone ranked – by search engines.

SEO Best Practices Part 4

URL Structuring

Creating descriptive categories and filenames for the documents on your website can not only help you keep your site better organized, but it could also lead to better crawling of your documents by search engines. Also, it can create easier, “friendlier” URLs for those that want to link to your content.
URLs stuffed with cryptic parameters and ID numbers can be confusing and unfriendly. Users would have a hard time reciting the URL from memory or creating a link to it. Also, users may believe that a portion of the URL is unnecessary, especially if the URL shows many unrecognizable parameters. They might leave off a part, breaking the link.

Use words in URLs

URLs with words that are relevant to your site’s content and structure are friendlier for visitors navigating your site. Visitors remember them better and might be more willing to link to them.
  • Do use words in your URL structure or pathing:
    • Right: http://www.leroysstampcollecting.com/vintage/rare/mickeymouse.htm
    • Wrong: http://www.leroysstampcollecting.com/folder1/1089257/x1/0000023a.htm
  • Avoid:
    • Using lengthy URLs with unnecessary parameters and session IDs
    • Choosing generic page names like “page1.html”
    • Using excessive keywords like “stamp-collecting-stamps-collect-stampcollecting.htm”

Dynamic pages

If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

Simple Directory structure

Use a directory structure that organizes your content well and makes it easy for visitors to know where they are on your site. Try using your directory structure to indicate the type of content found at that URL.
  • Example: “…/vintage/rare/mickeymouse.html”
  • Avoid:
    • Having deep nesting of subdirectories like “…/dir1/dir2/dir3/dir4/dir5/dir6/page.html”
    • Using directory names that have no relation to the content in them

One version of a URL to reach a document

To prevent users from linking to one version of a URL and others linking to a different version (this could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from non-preferred URLs to the dominant URL is a good solution for this.
  • Avoid:
    • Having pages from subdomains and the root directory (e.g. “domain.com/page.htm” and “sub.domain.com/page.htm”) access the same content
    • Using odd capitalization of URLs (many users expect lower-case URLs and remember them better)
  • Never: Mix www. and non-www. versions of URLs in your internal linking structure

SEO Best Practices Part 3

Use of robots.txt file

A “robots.txt” file tells search engines whether they can access and therefore crawl parts of your site. This file, which must be named “robots.txt”, is placed in the root directory of your site. It’s important to have a robots.txt file present in your root directory because some search spiders will not crawl a site if they don’t find the robots.txt file.

Files and Directories

Be sure to declare which files/directories you don’t want the robots to crawl. Most bots will recognize most commands. Visit Google’s Webmaster tools for declaration lists.
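
For example, a minimal robots.txt sketch that blocks a couple of hypothetical directories while leaving the rest of the site crawlable:

User-agent: *
Disallow: /admin/
Disallow: /images/private/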

Session IDs

Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.

SEO Best Practices Part 2

Meta Tags

Description <meta name="description" />

A page’s “description” meta tag gives search engines a summary of what the page is about. Whereas a page’s title may be a few words or a phrase, a page’s description meta tag might be a sentence or two or a short paragraph. Google’s Content Analysis Section will tell you about any description meta tags that are too short, long, or duplicated too many times.

Accurately summarize the page’s content

Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. Be sure to include your keywords and key phrases as well.
  • Example: Rare vintage stamps from around the United States that date all the way back to the early 1800s.
  • Avoid:
    • Writing a description meta tag that has no relation to the content on the page
    • Using generic descriptions like “This is a webpage” or “Page about stamp collecting”
    • Filling the description with only keywords
  • Never: Copy and paste the entire content of the document into the description meta tag
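
In the page’s <head>, that description simply goes into the meta tag named in the heading above, for example:

<meta name="description" content="Rare vintage stamps from around the United States that date all the way back to the early 1800s." />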

Unique descriptions for each page

Having a different description meta tag for each page helps both users and search engines, especially in searches where users may bring up multiple pages on your domain. If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn’t feasible. In this case, you could automatically generate description meta tags based on each page’s content.
  • Example:
    • 1st Page – Rare vintage stamps from around the United States that date all the way back to the early 1800s.
    • 2nd Page – Misprinted stamps that were never meant to make it to the general public.
  • Avoid: Using a single description meta tag across all of your site’s pages or a large group of pages

Keywords

These are words that describe the content of that page, separated by commas. Although it has been speculated that Google and Live don’t use the meta keywords tag (and Google provides no documentation on best practices for it), Yahoo, Ask, and others do. Using them definitely doesn’t hurt your chances for higher page rankings. Also, it is important to understand that each page’s content will most likely be different from the next and will probably have different keywords that need to be associated with it.

Accurately summarize the page’s “key” words

Write keywords that are relevant to your site content. Also, use plural versions of your keywords when appropriate; you will get hits from people searching for both the singular and plural versions of a word.
  • Example: “stamps, collecting, rare, vintage”
  • Avoid: Using words that don’t appear in the page content at all. If you have the keywords “stamp collecting” in a meta tag but do not have the words in the content of your page, it is very unlikely that those keywords will help the page do well for ranking.
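
In markup, the example above becomes:

<meta name="keywords" content="stamps, collecting, rare, vintage" />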

Keyword popularity with search engines

Are targeted keywords popular phrases used in search engines? A simple keyword search in a search engine will tell you how popular they are. You could also dig through your referrer logs or Web analytics data to determine which search engines send your site considerable traffic for which phrases.

Keywords used by competitors

Check your competitors to see what keyword usage they have. Sometimes they have already done the keyword hunt and you can cross-reference your words with theirs.

Key phrases vs. keywords

Single keywords are much more commonly entered by web users, which makes them more difficult to target effectively than multi-word key phrases. Unless the single keywords are highly unique, your best results will be achieved using key phrases. Consider using key phrases when applicable.
  • Example: Try using key phrases like “vintage stamps” versus “stamps”.
  • Avoid: Repeating keywords in an attempt to manufacture key phrases, e.g. “new, stamps, vintage, stamps, stamp, collecting”; this will have a negative effect on ranking because the word “stamp” is duplicated three times.

Special timing for keywords/key phrases

Depending on your industry, you might consider creating a calendar of keywords and key phrases for future reference. In some industries, the popularity of given keywords can fluctuate throughout the year. Site owners operating in such industries might find it worthwhile to keep a calendar that lists which keywords and key phrases are popular at different times of year.

SEO Best Practices Part 1

Search engine optimization, or SEO as it is most commonly called, is by far one of the most important aspects of a website’s design and, more importantly, its content. I have spent some time compiling my research and experience into the best practices that should be followed for SEO, along with several examples of what should and shouldn’t be done.

Page Title

A title tag tells both users and search engines what the topic of a particular page is. Ideally, you should create a unique title for each page on your site. Creating keyword/key phrase-rich text for your page title is critically important, because nearly all search engines give the title element’s text a lot of weight. Create each page’s title to reflect the specific content of that page, using key phrases that people might type into search engines to find your web site.

Accurately describe the page’s content

Choose a title that effectively communicates the topic of the page’s content.
  • Example: Rare Vintage Stamps
  • Avoid: Choosing a title that has no relation to the content on the page
  • Never: Use default or vague titles like "Untitled" or "New Page 1"

Create unique title tags for each page

Each of your pages should ideally have a unique title tag, which helps Google know how the page is distinct from the others on your site.
  • Example:
    • 1st Page – Rare Vintage Stamps
    • 2nd Page – Misprinted Stamps
  • Avoid: Choosing a title that has no relation to the content on the page

Use brief, but descriptive titles

Titles can be both short and informative. If the title is too long, Google will show only a portion of it in the search result.
  • Avoid:
    • Using extremely lengthy titles that are unhelpful to users
    • Stuffing unneeded keywords in your title tags

Display page description first then company/site name second

Because search engines only display a limited number of characters of the page title in their results, it is important to let your user know what the page is about before showing your company/site name. If the company/site name comes first and happens to be long, you could be missing out on potential traffic because your keywords/key phrases get pushed beyond the threshold of what can be displayed in the search results.
  • Example: Rare Vintage Stamps | Leroy’s Stamp Collecting Site
  • Never: Leroy’s Stamp Collecting Site | Rare Vintage Stamps
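
Put together in markup, the recommended ordering looks like this:

<head>
  <title>Rare Vintage Stamps | Leroy's Stamp Collecting Site</title>
</head>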

Google Web Fonts

Last May, Google introduced the Google Font Directory, a collection of high quality open source web fonts, and the Google Font API to make them available to everybody on the web. They had an initial rollout of 18 fonts and are frequently adding to that library.
As a web developer and typography nerd, this is an exciting service that Google is providing. If you’ve used Google Docs, you already have some experience with the use of the fonts.



No longer will sites need to be confined to the likes of Arial, Times New Roman, Georgia, et al. It should be noted that there’s nothing wrong with these fonts (except for Comic Sans). This service offers a library of wide-ranging types of fonts for designers to create without as many restrictions.
What browsers are supported?
The Google Web Fonts API is compatible with the following browsers:
  • Google Chrome: version 4.249.4+
  • Mozilla Firefox: version: 3.5+
  • Apple Safari: version 3.1+
  • Opera: version 10.5+
  • Microsoft Internet Explorer: version 6+
Does the Google Web Fonts API work on mobile devices?
 - The Google Web Fonts API works reliably on the vast majority of modern mobile operating systems, including Android 2.2+ and iOS 4.2+ (iPhone, iPad, iPod). Support for earlier iOS versions is limited.

Implementation

It is easy to use a Google web font within your site.
  • Visit Google Web Fonts
  • Click on the font you’d like to use
  • Click the “Use this font” tab
  • Highlight and copy the LINK code provided
<link href='http://fonts.googleapis.com/css?family=Pacifico' rel='stylesheet' type='text/css'>
  • Paste the HTML code into your site. (The instructions state to paste it into the HEAD code, but I was able to get it to work successfully from within the BODY tag.)
  • Lastly, apply the font through CSS
h1 { font-family: 'Pacifico', arial, serif; }


Url: http://www.google.com/webfonts

Sunday, December 18, 2011

IE to Start Automatic Upgrades across Windows XP, Windows Vista, and Windows 7

Everyone benefits from an up-to-date browser.
Today we are sharing our plan to automatically upgrade Windows customers to the latest version of Internet Explorer available for their PC. This is an important step in helping to move the Web forward. We will start in January for customers in Australia and Brazil who have turned on automatic updating via Windows Update. Similar to our release of IE9 earlier this year, we will take a measured approach, scaling up over time.



As always, when upgrading from one version of Internet Explorer to the next through Windows Update, the user’s home page, search provider, and default browser remain unchanged.

Full article : http://windowsteamblog.com/ie/b/ie/archive/2011/12/15/ie-to-start-automatic-upgrades-across-windows-xp-windows-vista-and-windows-7.aspx

Thursday, December 15, 2011

Difference between UTF-8 and UTF-8 without BOM?

The UTF-8 BOM is a sequence of bytes (EF BB BF) that allows the reader to identify the file as a UTF-8 file.
Normally, a BOM is used to signal the endianness of an encoding, but since UTF-8 has no byte-order ambiguity, the BOM is unnecessary.
According to the Unicode standard, the BOM for UTF-8 files is not recommended:

2.6 Encoding Schemes

Use of a BOM is neither required nor recommended for UTF-8, but may be encountered in contexts where UTF-8 data is converted from other encoding forms that use a BOM or where the BOM is used as a UTF-8 signature. See the "Byte Order Mark" subsection in Section 16.8, Specials, for more information.

How to create slick effects with CSS3 box-shadow

Drop shadows and inner shadows are some of the effects I learned to apply using Photoshop’s Blending Options. But now that CSS3 has “hit the charts”, you no longer need Adobe’s design tool to add a drop shadow or an inner shadow to a box; you can create beautiful shadows with pure CSS3.


box-shadow property

The box-shadow property allows you to add multiple shadows (outer or inner) to box elements. To do that you specify values for the horizontal and vertical offsets, the blur radius, the spread distance, and the color.
<shadow> = inset? && [ <length>{2,4} && <color>? ]

Rocket science?

Not at all. Here’s a quick example:
box-shadow: 3px 3px 10px 5px #000;
This CSS declaration will generate the following shadow:


  • The first length is the horizontal offset: a positive value draws the shadow to the right of the box, a negative value to the left.
  • The second length is the vertical offset: a positive value offsets the shadow down, a negative one up.
  • The blur radius cannot be negative. The larger the value, the more the shadow’s edge is blurred.
  • Positive spread distances cause the shadow shape to expand in all directions by the specified radius; negative ones cause it to contract.
  • The color is the color of the shadow.
  • The inset keyword (missing above), if present, changes the drop shadow from an outer shadow to an inner shadow.
That’s just a small dose of theory; if you want to read more, be my guest and check the W3C specs.

Enough theory, let’s see some stuff!

Now let’s see how can you take advantage of this wonderful CSS3 feature. Below I’ll show you how to enhance your designs with the coolest box-shadow techniques!

Add depth to your body

 

body:before{
   content: "";
   position: fixed;
   top: -10px;
   left: 0;
   width: 100%;
   height: 10px;
   z-index: 100;
   -webkit-box-shadow: 0px 0px 10px rgba(0,0,0,.8);
   -moz-box-shadow: 0px 0px 10px rgba(0,0,0,.8);
   box-shadow: 0px 0px 10px rgba(0,0,0,.8);
   }

 

Drop shadows



Here’s the CSS for a classic lifted-corner drop shadow effect, pieced together from the articles that inspired me:
#box
{
  position: relative;
  width: 60%;
  background: #ddd;
  -moz-border-radius: 4px;
  border-radius: 4px;
  padding: 2em 1.5em;
  color: rgba(0,0,0, .8);
  text-shadow: 0 1px 0 #fff;
  line-height: 1.5;
  margin: 60px auto;
}
#box:before, #box:after
{
  z-index: -1;
  position: absolute;
  content: "";
  bottom: 15px;
  left: 10px;
  width: 50%;
  top: 80%;
  max-width:300px;
  background: rgba(0, 0, 0, 0.7);
  -webkit-box-shadow: 0 15px 10px rgba(0,0,0, 0.7);
  -moz-box-shadow: 0 15px 10px rgba(0, 0, 0, 0.7);
  box-shadow: 0 15px 10px rgba(0, 0, 0, 0.7);
  -webkit-transform: rotate(-3deg);
  -moz-transform: rotate(-3deg);
  -o-transform: rotate(-3deg);
  -ms-transform: rotate(-3deg);
  transform: rotate(-3deg);
}
#box:after
{
  -webkit-transform: rotate(3deg);
  -moz-transform: rotate(3deg);
  -o-transform: rotate(3deg);
  -ms-transform: rotate(3deg);
  transform: rotate(3deg);
  right: 10px;
  left: auto;
}

Quick tips

Try spicing up shadows with RGBa color. The box-shadow property can be combined with CSS3 RGBa colors to create shadows with differing levels of opacity. If your browser supports the box-shadow property, then it will definitely support the RGBa color mode as well.
Use multiple shadows in one CSS declaration:
  box-shadow: 3px 3px 10px 5px #000, 0 0 4px rgba(0, 0, 0, .5) inset;

Browser Support

  • Internet Explorer 9/10
  • Firefox (from 3.5)
  • Safari/Chrome
  • Opera (from 10.5)

PHP date format examples

echo date("D - m/d/y"); // Wed - 12/14/11

echo date("l - \T\h\e jS \d\a\y \of F"); // Wednesday - The 14th day of December

echo date('l \t\h\e jS'); // Wednesday the 15th

echo "July 1, 2000 is on a " . date("l", mktime(0, 0, 0, 7, 1, 2000)); // July 1, 2000 is on a Saturday

echo date('l jS \of F Y h:i:s A'); // Monday 8th of August 2005 03:12:46 PM

echo date("l"); // Monday

echo date(DATE_ATOM, mktime(0, 0, 0, 7, 1, 2000)); // 2000-07-01T00:00:00+00:00

echo date(DATE_RFC822); // Mon, 15 Aug 2005 15:12:46 UTC

echo date("F j, Y, g:i a"); // March 10, 2001, 5:16 pm

echo date("m.d.y"); // 03.10.01

echo date("j, n, Y"); // 10, 3, 2001

echo date("Ymd"); // 20010310

echo date('h-i-s, j-m-y, it is w Day'); // 05-16-18, 10-03-01, 1631 1618 6 Satpm01

echo date('\i\t \i\s \t\h\e jS \d\a\y.'); // it is the 10th day.

echo date("D M j G:i:s T Y"); // Sat Mar 10 17:16:18 MST 2001

echo date('H:m:s \m \i\s\ \m\o\n\t\h'); // 17:03:18 m is month

echo date("H:i:s"); // 17:16:18

echo date( "F jS, Y ",strtotime("2011-12-14 11:23:20")); // December 5th, 2011

Lorem ipsum for images



Link : http://lorempixel.com/

Wednesday, December 14, 2011

CSS3 Slideup Boxes

Follow along as we use a few very simple CSS3 transitions to create a "slideup" box effect. Roll over the box with your mouse, and the title of the box slides out of the way and a more descriptive stylized box of information jockeys its way into place. This is supported in modern versions of Gecko, WebKit, and Opera browsers. No Internet Explorer support yet, but no fallback is needed: the link still works and the information shows as expected.





CSS3 Facebook buttons




Link: http://nicolasgallagher.com/lab/css3-facebook-buttons/

CSS3 BonBon Buttons

Create CSS buttons that are sexy looking and really flexible, but with markup that is as minimal as possible.

And voilà... here they are, the BonBon Buttons. Named after the French word for "Candy". So, let's take a tour through the candy store.


Link: http://lab.simurai.com/css/buttons/

CSS Exclusions and Shapes Module Level 3


The W3C just fulfilled a CSS fantasy of mine from 2007 with the announcement of CSS Exclusions and Shapes:

- The exclusions section of this specification defines features that allow inline flow content to wrap around outside the exclusion area of elements. The shapes section of the specification defines properties to control the geometry of an element's exclusion area as well as the geometry used for wrapping an element's inline flow content.


Documentation : http://www.w3.org/TR/css3-exclusions/