
Web Performance Optimization


For a customer, we optimized the performance of their website. The optimization was inspired by the book "High Performance Web Sites: 14 Steps to Faster-Loading Web Sites" by Steve Souders.

Based on the techniques described in this book, the following rules were drawn up and implemented.

Allow rendering as early as possible

When a user opens a page, he has to wait until the page has been built up. While he waits, he needs visual feedback so he can be sure that the page is working. With feedback, the time he spends waiting for the page feels subjectively shorter.

Web pages usually do not display a progress bar; the page itself is the progress bar. Instead of the user sitting in front of a white page that appears all at once after the waiting time, the page should start building up in the browser as soon as possible.

To achieve this, the web browser must technically be able to display already downloaded content as early as possible. For that, it needs all information required for rendering the page as soon as possible. You should therefore:

- flush the HTTP output, for example after rendering the head region.
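In a JSP environment like the one described here (Tomcat behind Apache), such an early flush might be sketched as follows; the stylesheet path is only a placeholder:

```jsp
<html>
<head>
    <title>Example</title>
    <link rel="stylesheet" href="/static/css/site.css">
</head>
<%-- flush the head region so the browser can already fetch the CSS --%>
<% out.flush(); %>
<body>
    <%-- the rest of the page is rendered and sent afterwards --%>
</body>
</html>
```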

A white page also appears (especially in Internet Explorer) when:

  • a page is opened in a new window,
  • the browser window is moved, minimized, or restored while the page reloads,
  • a page is loaded as the home page (i.e. as the very first page).
Worse than a white page is a Flash of Unstyled Content (FOUC). This occurs when style sheet information is applied only after the browser has already started rendering the content.

Set the dimensions of graphics and images

If you specify the height and width of images (pictures and graphics) directly in the image tag, the browser can start laying out the page even before the actual image binaries have been downloaded.
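For example (file name and dimensions are placeholders):

```html
<!-- width and height let the browser reserve the layout space
     before the binary image data has arrived -->
<img src="/static/img/logo.png" width="120" height="40" alt="Logo">
```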

Include CSS in the document head

Style sheets that are included in the document head (HEAD) allow the browser to begin rendering as soon as possible. Including them in the document head also avoids the FOUC.

CSS should always be included with the link tag. @import rules in CSS are evaluated only later, after the CSS has already been downloaded and parsed.

Embed CSS before JavaScript

Browsers block JavaScript execution until all CSS files are loaded, because the JavaScript could read information from style attributes; the browser therefore waits for the CSS files before executing any JavaScript. To avoid this blocking, CSS files should be referenced in the HTML code before JavaScript files.
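A head section following both of these rules might look like this (file names are placeholders):

```html
<head>
    <!-- CSS first, included via link tags -->
    <link rel="stylesheet" href="/static/css/site.css">
    <!-- JavaScript afterwards, so it is not blocked by pending CSS -->
    <script src="/static/js/site.js"></script>
</head>
```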

Reduce the number of HTTP requests

The fewer individual resources (files) a browser has to download, the faster the page loads, because the overhead of HTTP requests and TCP round-trips is eliminated. Also: the smaller these resources are, the faster they are transmitted.

Parallelize downloads across different hosts

A browser can open only a certain number of connections per host. The site browserscope.org offers a survey to determine how many connections a browser can open at the same time. Internet Explorer 6 (IE6), for example, can open only two connections per host.

When the resources of a site are distributed across multiple hosts, the browser can use a larger number of connections.

Two (virtual) static hosts were therefore set up, from which the static content of this website is delivered.

Reduce DNS Lookups

However, you cannot use an arbitrary number of host names: each host name requires a DNS lookup, and Firefox caches DNS lookups for only one minute. That is, if you use too many hosts for parallel downloads, the speed advantage gained from the additional hosts is eaten up again by the DNS lookups.

So no more than two to four hosts per site should be used for parallel downloads. Embedding resources from many hosts, as is common with web beacons and mash-ups, should be avoided.

Reduce SSL handshakes

For each page delivered over HTTPS, it should be reviewed whether using several hosts in parallel is worthwhile at all. The time gained by parallel downloads may be consumed again by the additional SSL handshakes.

Use a CDN

Content Delivery Networks offer a network of servers hosted at various locations. Through this network, resources such as videos or large downloads can be distributed geographically. Users who request a resource from the CDN are automatically served by a physically nearby server.

Large files (such as trailers and other Flash movies) are delivered over a CDN. Flash movies must always be organized so that they can be delivered from any URL. Because of ActionScript and the Same Origin Policy, this is not self-evident.

Use cookie-free hosting for static content

Cookies are sent from the browser to the server with every request to the cookie domain. The bandwidth from browser to server is often much smaller than the bandwidth from server to browser, and for much content the transfer of these cookies is useless overhead. Static content, which requires no session information or other cookie data, should therefore be delivered from a second, cookie-free domain. Examples of such content are style sheets, graphics, and Flash movies that cannot be delivered via the CDN (see above).

A URI builder in a CMS can distinguish between static and dynamic content: it delivers dynamic content without a host name, and static content with a host configured for static content.

Optimize Style Sheets

HTTP requests can easily be reduced by combining CSS files into a few resources or even a single one.

Superfluous white space can be removed from the CSS files with the YUI Compressor. This compression should be automated in the build or publication process.

Since CSS Expressions and long CSS selectors have a negative impact on browser performance, they should be avoided.

Background images in style sheets can be combined into CSS sprites. If a page uses 25 graphics of the same size, they can be combined into one sprite map with 5x5 elements. Instead of 25 HTTP requests, only a single HTTP request takes place. A good example of CSS sprites is the sprite map that Google uses on its search page.
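In the style sheet, individual tiles of the sprite map are then selected via background-position; the class names and offsets here are purely illustrative:

```css
.icon {
    /* one sprite map instead of many single images */
    background-image: url(/static/img/sprite.png);
    width: 16px;
    height: 16px;
}
/* shift the background to show a particular 16x16 tile */
.icon-home   { background-position: 0 0; }
.icon-search { background-position: -16px 0; }
.icon-mail   { background-position: 0 -16px; }
```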

The bookmarklet from spriteme.org can automatically generate sprite maps for a page.

Optimize JavaScript

JavaScript that is needed on several pages should not be inlined in the HTML document but included as a separate JavaScript resource; this reduces the size of each HTML document. JavaScript that is used on only one page should be inlined in the HTML document to reduce the number of requests.

Many browsers load JavaScript, unlike other resources such as images or style sheets, not in parallel; they process one JavaScript document after the other. Downloading JavaScript thus blocks the parallel download of other resources. That is why improvements to JavaScript libraries are particularly noticeable.

Similar to CSS files, several JavaScript libraries can be combined into a few JavaScript resources.
This also has the advantage that dependencies between libraries are made explicit: if a library such as jQuery UI depends on the library jQuery, then jQuery is included in the same resource just before jQuery UI. Combining JavaScript libraries into a single JavaScript resource should be carried out by an assembler in the build process. Using an assembler in the build process also ensures that no scripts are included twice.

The size of JavaScript documents can easily be reduced by minification, which removes comments and unnecessary whitespace from the JavaScript. Common compressors are Google Closure, JSMin by Douglas Crockford, and the YUI Compressor from Yahoo. JavaScript minification should also be automated in the build process. Alternatively, JavaScript code can be compressed at run-time with the YUI Compressor; for that, Google's HTML Compressor (htmlcompressor.googlecode.com) can be used in conjunction with the YUI Compressor.

JavaScript that is not needed before the page is rendered can simply be placed at the foot of the document. This way, visible elements are no longer blocked by loading JavaScript resources, and the page is already displayed while JavaScript libraries included at the bottom of the page are still loading.

JavaScript should be written so that it does not write directly into the HTML stream via document.write() but manipulates the DOM after the page has loaded. This prevents the browser from blocking rendering or even having to perform an unnecessary re-rendering.
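A minimal sketch of this pattern, with a made-up element ID:

```html
<div id="late-slot"></div>
<script>
    // manipulate the DOM after the page has loaded,
    // instead of writing into the HTML stream via document.write()
    window.onload = function () {
        var slot = document.getElementById("late-slot");
        slot.appendChild(document.createTextNode("content loaded late"));
    };
</script>
```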

Use the Expires header

When a page is visited for the first time, the browser must download all of its content. This content can be cached by the browser. Whether and for how long content is cached is controlled via HTTP headers such as the Expires header:

Expires: Thu, 14 Oct 2010 09:01:30 GMT

For static content, this header should be set, for example, one year into the future. The browser is thus instructed to cache the documents for as long as possible. The header can also be set even further into the future, although the RFC advises against that.

The Expires header has the disadvantage that it expects a date; strictly speaking, the clocks of server and browser would therefore have to be synchronized. In addition to the Expires header, HTTP 1.1 therefore introduced the max-age directive of the Cache-Control header, which takes a value in seconds:

Expires: Thu, 14 Oct 2010 09:01:30 GMT
Cache-Control: max-age=31536000

Although the Cache-Control header is meant to replace the Expires header, it is advisable to set both headers consistently. Apache's mod_expires module can take care of this. However, if static content that stays in the browser cache for a year is changed on the server, users will no longer receive the new version.
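With mod_expires, setting both headers consistently might look like this; the one-year lifetime and the MIME types are examples:

```apache
# mod_expires emits both Expires and Cache-Control: max-age
ExpiresActive On
ExpiresByType text/css               "access plus 1 year"
ExpiresByType application/javascript "access plus 1 year"
ExpiresByType image/gif              "access plus 1 year"
```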

When assigning Expires headers, it must therefore also be ensured that changed content receives a new (unique) URL. A URI builder can, for example, integrate the release date of a resource into the URL of the resource: content/static/5940026/2009-10-12-11-52-39/thumbnail.gif

When a new version of the resource is released, a new URL is automatically generated in the HTML document. The browser then no longer serves the resource from its cache but fetches the new version of the resource.
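Such a URI builder could be sketched as follows; the function name and date format simply mirror the example URL above and are not taken from any real CMS:

```javascript
// Build a cache-safe URL that embeds the release date of a resource.
// A new release date produces a new URL, so browsers fetch the new
// version instead of serving a stale copy from their cache.
function buildStaticUrl(resourceId, releaseDate, fileName) {
    function pad(n) { return n < 10 ? "0" + n : "" + n; }
    // format the date as yyyy-mm-dd-hh-mm-ss, as in the example URL
    var stamp = [
        releaseDate.getFullYear(),
        pad(releaseDate.getMonth() + 1),
        pad(releaseDate.getDate()),
        pad(releaseDate.getHours()),
        pad(releaseDate.getMinutes()),
        pad(releaseDate.getSeconds())
    ].join("-");
    return "content/static/" + resourceId + "/" + stamp + "/" + fileName;
}
```

For the resource from the example, buildStaticUrl(5940026, new Date(2009, 9, 12, 11, 52, 39), "thumbnail.gif") yields the URL shown above.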

Instead of a unique URL, the ETag could also be used. An ETag can be thought of as a checksum over the content. The server transmits this checksum (the ETag) together with the content in the HTTP header. When the browser needs the resource again, it transmits in the GET request the ETag of the resource it finds in its cache. If the ETag has not changed, the server can respond with an HTTP 304 response (Not Modified).

Compared with a unique URL, the ETag has two disadvantages: ETags are difficult to configure in an Apache cluster (with load balancing), and ETags still require an HTTP round-trip for every requested resource.

That is why Expires and Cache-Control headers are preferable to ETags.

ETags that Tomcat sets should therefore be removed by Apache!
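With mod_headers, this might be done as follows:

```apache
# do not emit ETags for files served by Apache itself
FileETag None
# strip the ETag header that Tomcat adds to proxied responses
Header unset ETag
```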

Compress resources

Textual resources (HTML, JavaScript, and CSS) can be transmitted in compressed form. Resources that are already compressed, such as graphics and PDFs, should not be compressed again. The most common compression is gzip. Browsers that support compressed transfer notify the server in the GET request with:

Accept-Encoding: gzip, deflate

If the server then delivers the resource compressed, it responds with the response header:

Content-Encoding: gzip

On Apache, gzip is enabled with the module mod_gzip (Apache 1.3) or mod_deflate (Apache 2.x).

Although Internet Explorer 6 officially supports compression, it occasionally runs into errors, e.g. when an old version of RealPlayer is installed on the same computer. To be on the safe side, compression should only be enabled for popular browsers that are newer than Internet Explorer 6.
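A conservative mod_deflate configuration along these lines might look as follows (Apache 2.x; the browser matching is a sketch, not a complete list):

```apache
# compress only textual content types
AddOutputFilterByType DEFLATE text/html text/css application/javascript
# disable compression for old Mozilla/4-based user agents; this includes
# IE6, which sends "Mozilla/4.0 (compatible; MSIE 6.0; ...)"
BrowserMatch ^Mozilla/4 no-gzip
# re-enable it for Internet Explorer 7 and newer
BrowserMatch "MSIE [7-9]" !no-gzip
```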

Unfortunately, some proxies, such as those used in companies, interpret the Content-Encoding header incorrectly. The result can be that a browser that supports gzip is delivered uncompressed resources by the proxy, or worse, that a browser that does not support compression receives a compressed website from the proxy.

The Vary header should therefore be set accordingly, so that a proxy caches different resources for different encodings and browsers (user agents):

Vary: Accept-Encoding, User-Agent

Unfortunately, this way many variants end up in the cache, which is why the Vary header was not configured as described above. It is accepted instead that bad proxies and IE6 versions receive a non-optimal view of this site.

Before zipping, the HTML content can be minified with the code compressor. This removes unnecessary HTML comments, whitespace, etc.:

<%@ taglib uri="http://htmlcompressor.googlecode.com/taglib/compressor" prefix="compress" %>
<compress:html enabled="true" compressJavaScript="true">
    <!-- here is your JSP code -->
</compress:html>

Enable proxy caching

Although there can be problems with proxies (see above), caching by proxies should still generally be encouraged. This is achieved by setting the Cache-Control header to public, which tells not only browsers but also proxies explicitly that the content may be cached:

Cache-Control: public

Some browsers can even be persuaded by this header to cache resources that are delivered over HTTPS, although resources delivered over a protected connection are usually not cached on disk.
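In Apache, this could for instance be configured per path; the max-age value is illustrative:

```apache
# allow browsers and proxies to cache static content
<Location /static>
    Header set Cache-Control "public, max-age=31536000"
</Location>
```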