Monday, March 11, 2019

I saw Robert Boedigheimer speak on performance optimizations at the Twin Cities .NET User Group on Thursday night.

.dev is a top-level domain that is now available. Robert Boedigheimer has his own .dev domain name and, yes, email address. Per Robert, eighty to ninety percent of the time a user spends waiting on a page is spent waiting on front-end resources, not on the server interacting with databases or your C# code struggling with string concatenation sans StringBuilder. (Oh man, every time you append to a string you are really destroying and recreating the whole string because strings are immutable. Oh no!) Forget server-side concerns? Robert suggests it is your CSS and JavaScript making your site load slow, and, yes, slow load times are bad. Robert mentioned that Amazon and WalMart experimented with making their web presences load one hundred milliseconds slower and saw a one percent drop-off in sales. I am not convinced that there is no reason to worry about server-side performance. The talk touched on that some, and moreover I have seen the pain of server-side performance problems firsthand at, well... I won't say. Alright, we can have problems like that on the client side too. Robert suggests the three big things you can do to improve performance are:

  1. make fewer HTTP requests
  2. send as little as possible
  3. send it as infrequently as possible

Eric Lawrence's Fiddler was acquired by Telerik in 2012 but is still available for free and is still a good tool for seeing what is going on with your web traffic in the name of putting your finger on a problem. "Speed Index" as a metric is the number of milliseconds until the viewport is mostly rendered, while "Time to Interactive" is a comparable metric for how long it takes before a user may actually click stuff. webpagetest.org gives you waterfall diagrams (timelines) of where time is spent in bringing up your site. Keynote has become Dynatrace, and you can pay this company a fee for "synthetic monitoring" in which they hit your site on a schedule and report back to you what they thought of its snappiness. RUM (Real User Monitoring), in contrast, collects performance timings from real users' actual visits to your site. As you use Chrome it is reporting back to Google (unless you forbid its telemetry data mining), and thus you may go to https://www.thinkwithgoogle.com/feature/testmysite to look up performance by URL for any one site and cross-compare the performance of competitors to your own. Robert recommends checking the "Enable dynamic content compression" checkbox in IIS to turn on HTTP compression, which evaluates the "Accept-Encoding" header in requests and compresses response bodies with GZIP or Deflate or Brotli. Leverage the browser cache by allowing images to be cached at the client for thirty days as suggested by the server. To make IIS make such a suggestion, pick "Set Common Headers" under "HTTP Response Headers" and then set the expiration setting to X number of days. (A web.config sketch covering both of these settings follows the next example.) If a resource is slow to arrive, what is going on? Is it latency or is it bandwidth? It turns out that it is probably latency, driven by the number of hops across routers one must journey to reach your server. Make your images "closer" to those requesting them with a CDN (content delivery network); Limelight is one such CDN. It is also possible to have protocol-relative links, like the one below, which resolve to http or https to match the page they appear in.

<img src="//cdn.example.com/images/logo.png">
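
Backing up to those two IIS settings for a moment, here is roughly what they look like expressed in web.config instead of clicked through IIS Manager (a sketch; the thirty-day figure mirrors Robert's suggestion, everything else is my assumption, and dynamic compression additionally requires that the dynamic compression module be installed):

<configuration>
  <system.webServer>
    <!-- HTTP compression: honor Accept-Encoding and compress responses -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
    <!-- Leverage the browser cache: suggest clients keep static content for thirty days -->
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="30.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>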

With regards to bundling and minification, HTTP/2 is a game changer. The old paradigm of jamming all of your JavaScript into one file so that one file equates to one request may be forgotten; HTTP/2 multiplexes many small requests over a single connection, so having multiple files no longer costs you multiple round trips. This also means that when you change one of the small files the client does not have to fetch all of your JavaScript anew! Use SVG for logos. JPEGs are typically smaller than PNGs for photographic images. Robert was not a fan of having images Base64-embedded in the HTML itself as a bunch of goofy gibberish that cannot be cached on its own. You also need to have fallbacks when utilizing a CDN: if jQuery fails to arrive from the CDN, pull a copy from your own server instead.
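Something like this is the usual trick (a minimal sketch; the jQuery version and the local fallback path here are my assumptions, not details from the talk):

<script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
<script>
    // If the CDN copy failed to arrive, window.jQuery will be undefined,
    // so write out a script tag pointing at a copy on our own server.
    window.jQuery || document.write('<script src="/js/jquery-3.3.1.min.js"><\/script>');
</script>

Back to images: there is a trick for having images of different sizes for responsive design too. Observe: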

<img srcset="/images/flagBridge-128.jpg 128w,
             /images/flagBridge-256.jpg 256w,
             /images/flagBridge-512.jpg 512w"
     src="/images/flagBridge-default.jpg">

Obviously, the reason to indulge srcset is to load images of smaller file sizes when possible. Don't choke your mobile user with big images, etc. jpegtran is a command line tool that, among other things, strips unnecessary metadata from JPEG images (something like jpegtran -copy none -optimize -outfile small.jpg big.jpg). The geolocation of where a photo was taken, what type of camera it was taken with, and what shutter speed was used are amongst the stupid things in almost every JPEG file adding to its bloat. Fiddler's Image Bloat feature will overlay images with red bricks, partially or almost completely covering them, to show just how much of a file's size is metadata versus actual image. Robert showed off examples of images that were far more metadata than anything else. Adobe Photoshop allows for batch compression of images. JPEG has two formats, and the progressive format (as opposed to "baseline") shows you a fuzzy image at first that sharpens as the image loads, as opposed to the top loading first and the image slowly revealing itself downwards. If you buy images and take out the metadata, that is perfectly legal. Using data-src instead of src and data-srcset instead of srcset is a way to lazy load content after everything else is loaded, so that you may drag in the less vital content once the page is ready to go. You do have to wire this up with some JavaScript yourself (one rough way to do so appears after the prerender example below). If you use the async keyword like this:

<script src="/js/alert1.js" async></script>

The script in question will download without blocking other content. If you swap async out for defer in these circumstances, the deferred scripts will likewise be pulled in without holding up the main content, but they will wait for one another and end up executing in the order specified, once the document has been parsed. Resource hints are newish. Here is a way to prerender everything at a second web page that you bet users will visit next:

<link rel="prerender" href="http://www.example.com/">
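
As for the data-src lazy loading mentioned earlier, here is one rough way to wire it up by hand (a sketch assuming IntersectionObserver support; the lazy class name is my invention):

<img class="lazy" data-src="/images/flagBridge-512.jpg" alt="flag bridge">
<script>
    // Copy data-src into src once a lazy image nears the viewport so that
    // the browser only fetches the image at that point.
    var observer = new IntersectionObserver(function (entries) {
        entries.forEach(function (entry) {
            if (entry.isIntersecting) {
                entry.target.src = entry.target.dataset.src;
                observer.unobserve(entry.target);
            }
        });
    });
    document.querySelectorAll('img.lazy[data-src]').forEach(function (img) {
        observer.observe(img);
    });
</script>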

Browsers see URLs in a case-sensitive light, such that variations in casing across three calls to the same URL can result in three independent trips outward for one resource. HTTP Strict-Transport-Security, or HSTS, lets you tell browsers that your site should only be reached via https in lieu of http. Font Squirrel will allow you to take a font you plan to use online and trim it down to just the characters you need, reducing its size. Take out all of the wacky Korean characters you'll never use, etc. Lighthouse is an analytics tool, as is https://developers.google.com/speed/pagespeed/insights/ and it was recommended to do performance reviews with Fiddler and to use Corel PaintShop Pro as an inexpensive photo editor. If you are using web fonts, .woff2 files are smaller in file size than .woff files. webp is roughly twenty percent smaller than JPEG but, as of this talk, only supported by Google Chrome. Use the "Performance" tab in Google Chrome Developer Tools to record a movie of resources coming into play in a SPA to find out where a bottleneck might be in your performance.
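
As for HSTS, in IIS a bare-bones version is one custom response header away (a sketch; a real deployment would only send the header over https connections, and the max-age of one year in seconds is merely a common choice):

<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- Tell browsers to reach this host only via https for the next year -->
        <add name="Strict-Transport-Security" value="max-age=31536000" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>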
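
As for webp, the picture element is one way to hand webp to the browsers that understand it while everybody else falls back to a JPEG (a sketch reusing the image names from earlier):

<picture>
  <!-- Browsers that understand webp take this source... -->
  <source type="image/webp" srcset="/images/flagBridge-512.webp">
  <!-- ...everybody else falls through to the plain img tag. -->
  <img src="/images/flagBridge-512.jpg" alt="flag bridge">
</picture>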
