4 COMMON TIPS FOR IMPROVING WEBSITE SPEED AND WHY THEY ARE WRONG
Stop us if you've heard this before - fast websites convert better than slow websites. By now it should be obvious that your website's performance - the actual loading times of your pages, the size of your images, the overall page weight - is something that should not be ignored. In fact, speed, especially in the form of fast websites that enable great experiences, can be a competitive advantage.
The hard part of this simple 'speed = money' formula is actually doing the work to make your site faster. Sure, there is always low-hanging fruit, but any competent developer knows that loading multi-megabyte images is going to take a while.
Here are 4 common strategies for improving website speed - and why they are misguided.
Optimize your images:
Images make up over 50% of the average web page's weight and are often the slowest-loading elements on a page - we all know this. Shoppers love having plenty of product photos to help evaluate a purchase, and readers love supporting graphics and photos. We also know that images and conversion rates are linked - or rather, that a lack of high-quality images is heavily correlated with low conversion rates and high return rates for online shoppers. It should be obvious at this point that effort should be spent reducing the weight of images on your site.
However, too often we see brands 'fix' their image issues by either over-compressing images or removing them altogether. As we mentioned before, this leads to a poor experience for shoppers, even though metrics like page load time, page weight and bandwidth usage will all improve. The takeaway is to find the happy medium between big, beautiful, slow images and tiny, pixelated, fast images.
Bad: Sending massively oversized images to customers - think multiple megabytes or more - or massively over-compressing photos and reducing quality to the point of them being useless.
Better: Having creative teams manually compress and transcode all images before they are uploaded to the origin server, or blanket-applying a single compression level to every page.
Best: Using AI and ML to classify each image on the page, then applying optimal compression and transcoding each individual image to the correct format. This saves creatives and developers time, drastically reduces bandwidth needs and ensures fast, captivating websites.
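The per-image approach above comes down to a classification step followed by a format/quality decision. As a rough, hypothetical sketch of the decision being automated - the ImageInfo fields, thresholds and quality values here are illustrative stand-ins, not any real classifier or vendor's model:

```python
from dataclasses import dataclass

@dataclass
class ImageInfo:
    # Hypothetical per-image features a real classifier might extract.
    width: int
    height: int
    has_transparency: bool
    is_photograph: bool  # vs. a flat graphic, logo or icon

def pick_encoding(img: ImageInfo) -> tuple:
    """Choose an output format and quality level for one image.

    Photographs tolerate lossy compression well; flat graphics and
    transparent assets do not, so they get lossless treatment.
    """
    if img.has_transparency:
        return ("png", 100)  # lossless; quality is moot for PNG
    if img.is_photograph:
        # Large hero photos can absorb heavier compression than thumbnails.
        quality = 70 if img.width * img.height > 500_000 else 82
        return ("jpeg", quality)
    return ("webp", 90)  # flat graphics: near-lossless WebP

hero = ImageInfo(2400, 1200, has_transparency=False, is_photograph=True)
print(pick_encoding(hero))  # ('jpeg', 70)
```

In practice the classification would come from a trained model and the output formats would include modern codecs per browser support, but the shape of the per-image decision is the same.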
Manage 3rd party tags:
Make no mistake - 3rd party services like analytics, personalization, retargeting and communication tools have completely transformed marketing and the way companies do business on the internet. However, tools like Google Tag Manager, Segment, Tealium, and other tag managers have made it extremely easy for the ‘tag party’ to quickly spiral out of control. We routinely talk with Fortune 500 companies that have over 70 cloud services on their website, and sometimes north of 100.
Although all of these services claim to 'not degrade web performance', it is impossible for 70 or 100 services to all load asynchronously without causing some sort of delay. We recommend implementing a 'performance budget' and putting someone in charge of monitoring and enforcing cloud services on your site. If you set a budget of, say, 800ms or 2500ms for cloud services to load, you can easily keep a handle on things. It also helps your organization when new tools are being evaluated - a budget provides simple decision-making criteria.
Bad: Wild west - no oversight of what services get installed. All tags load synchronously, distributed across the header, body and footer, and referenced both before and after blocking events like DOM complete or onload.
Better: Use a tag manager to load all tags, and ensure every tag loads asynchronously. Do your best to identify tags that can be lazy-loaded or deferred.
Best: Put someone in charge of monitoring all 3rd party tags on the website and implement a 'performance budget' so new tags won't degrade site performance. Use an AI service that can detect and defer tags that aren't necessary at page load, and prioritize the loading of tags like ads or analytics that are critical to load first.
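A performance budget only works if someone can check it against real numbers. Here is a minimal sketch of a budget check, assuming per-tag load times (in milliseconds) have already been collected - for example from the browser's Resource Timing API; the 800ms default mirrors the figure suggested above, and the function name and report shape are our own invention:

```python
def check_tag_budget(tag_timings_ms, budget_ms=800):
    """Flag when third-party tags collectively blow the performance budget.

    tag_timings_ms: mapping of tag name -> load time in milliseconds.
    Returns the total, a pass/fail flag, and the worst offenders first
    so the budget owner knows where to start cutting.
    """
    total = sum(tag_timings_ms.values())
    offenders = sorted(tag_timings_ms.items(), key=lambda kv: kv[1], reverse=True)
    return {
        "total_ms": total,
        "over_budget": total > budget_ms,
        "offenders": offenders,
    }

report = check_tag_budget({"analytics": 120, "retargeting": 450, "chat": 380})
print(report["total_ms"], report["over_budget"])  # 950 True
```

Run against a budget of 800ms, a 950ms tag load total fails - and the sorted offender list makes the evaluation conversation with the tag's owner concrete.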
Use all the libraries:
Modern sites lean on JavaScript libraries and frameworks for nearly everything, but shipping an entire library when a page only uses a handful of its functions inflates bundle size and slows down every visitor.
Bad: Include every library in full, whether or not the page actually uses most of it.
Better: Manually optimize libraries during development - only include specific utilities, choose Inferno or Preact over React, and generally reduce bundle size as much as possible.
Best: Implement a service that uses AI and ML to monitor which calls are actually consumed by real website visitors, then automatically create new, optimized JS files that include only those calls. In the rare event a visitor needs an uncommon JS function, the service can stream that specific function down to the visitor.
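The usage-driven bundling idea above can be reduced to a simple rule: keep what real visitors actually call, defer the rest. This sketch shows that core logic under stated assumptions - the function and variable names are hypothetical, and real telemetry and bundling are far more involved:

```python
from collections import Counter

def build_minimal_bundle(library_exports, usage_log):
    """Split a library's exports into 'bundle now' vs. 'stream on demand'.

    library_exports: every function the full library ships.
    usage_log: function names observed in real-visitor telemetry.
    Functions never observed are left out of the optimized file and
    would be streamed down only when a visitor actually needs them.
    """
    counts = Counter(name for name in usage_log if name in library_exports)
    bundled = [name for name, _ in counts.most_common()]  # hottest calls first
    deferred = sorted(set(library_exports) - set(bundled))
    return bundled, deferred

exports = {"map", "filter", "debounce", "throttle", "cloneDeep"}
log = ["map", "map", "debounce", "map", "filter"]
bundled, deferred = build_minimal_bundle(exports, log)
print(bundled)  # ['map', 'debounce', 'filter']
```

Here 'throttle' and 'cloneDeep' were never observed in use, so they stay out of the optimized file - the same outcome a developer would reach with manual tree-shaking, but derived from real visitor behavior instead of guesswork.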
Use a CDN:
Every website speed article you lay your eyes on will tell you to implement a CDN. The speed benefit of a CDN is real, make no mistake, but most CDNs on the market are based on 20-year-old technology. Yes, you should locate content as close as reasonably possible to users, but the core problems of the internet 20 years ago are no longer the core problems of today. The T3 45Mbps backbones of the 90s and the 2.5Gbps backbones of the 2000s have been replaced by 100Gbps connections today.
Every major ISP has its own major connections, and almost every major population center in the world is located near a major peering center. The real problem today is the last mile - rich, immersive websites being delivered over wireless and Wi-Fi connections to thousands of different types of devices. CDNs today need to be architected for a mobile world, built to deliver modern content, and equipped with automatic optimizations to account for the immense variety of connections and devices.
Bad: Serve all of your content directly to users from your origin server.
Better: Use a standard CDN to cache static content at the edge in as many POPs as possible.
Best: Use a next-generation CDN built and optimized for the modern internet - specifically for mobile devices and connections. Cache static and dynamic content, and dramatically reduce page load times for all visitors.
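Caching both static and dynamic content mostly comes down to choosing the right Cache-Control policy per asset. As a minimal sketch of that decision - the extension list, TTL values and function name are illustrative assumptions, not any CDN's actual configuration:

```python
def cache_headers(path, is_dynamic=False):
    """Pick a Cache-Control header for a response served through a CDN edge.

    Static assets get a long, immutable TTL (they should be fingerprinted,
    so a new filename accompanies every change). Dynamic content gets a
    short edge TTL (s-maxage) so the edge still absorbs most requests
    while browsers always revalidate.
    """
    static_ext = (".css", ".js", ".png", ".jpg", ".webp", ".woff2")
    if not is_dynamic and path.endswith(static_ext):
        return {"Cache-Control": "public, max-age=31536000, immutable"}
    # Dynamic pages: cache briefly at the edge, serve stale while refreshing.
    return {"Cache-Control": "public, max-age=0, s-maxage=60, stale-while-revalidate=30"}

print(cache_headers("/assets/app.9f2c.js"))
print(cache_headers("/product/123", is_dynamic=True))
```

The split matters because the 'cache dynamic content' benefit comes almost entirely from the edge-only s-maxage directive: browsers stay fresh, but the origin only sees one request per minute per page.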