Feb 20, 2017
 

Site speed is arguably the number one issue facing web developers today.

Whether it’s this KISS Metrics blog post or another KISS Metrics blog post, study after study shows that delivering your content fast, fast, fast is a make-or-break factor in today’s web economy. That’s why it’s so important that your images are optimized for the web.

Photoshop and other tools export notoriously large files -- well over 1 MB. This is unacceptable in today’s world, where 33% of mobile users in the US are on 3G connections.

If you’re on Rails using Paperclip, I’ve got a great solution for you to explore today: image_optim. You can automagically compress all your images, both the ones in the Rails asset pipeline and the ones you upload with Paperclip. On Heroku, you’ll need two special buildpacks to make this work. Also, because Heroku uses an ephemeral file system, Paperclip needs to be configured to use an AWS S3 bucket as its storage.

First, refer to my blog post from last year about how to add the ImageMagick buildpack to your Cedar-14 Heroku build.

The instructions in that post will direct you to add this buildpack first:

heroku buildpacks:add -i 1 https://github.com/jasonfb/heroku-buildpack-cedar14-imagemagick704

Then add another buildpack to your Heroku environment:

heroku buildpacks:add -i 2 https://github.com/bobbus/image-optim-buildpack

(Note that you’re using the index flag to put this buildpack into position 2, because you should already have the ImageMagick buildpack at position 1.)

You should now have 3 buildpacks, which you can check with heroku buildpacks like so:

$ heroku buildpacks -a your-heroku-app
=== your-heroku-app Buildpack URLs
1. https://github.com/jasonfb/heroku-buildpack-cedar14-imagemagick704
2. https://github.com/bobbus/image-optim-buildpack
3. heroku/ruby

Then add these 4 gems to your Gemfile (for the sake of this post, I’ll assume you already have gem 'paperclip' in your Gemfile):

gem 'paperclip-optimizer'
gem 'image_optim'
gem 'image_optim_rails'
gem 'image_optim_pack'

To get this working on Heroku, you’ll also need to work through a few more steps: database setup and AWS configuration. For the lazy, check out the example app linked at the end of this post.
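For the AWS piece, Paperclip’s S3 configuration looks roughly like this. This is a minimal sketch, assuming you keep your credentials in ENV vars (the variable names below are placeholders, not necessarily the ones my example app uses) and that you have gem 'aws-sdk' in your Gemfile:

# config/environments/production.rb -- a minimal sketch; the ENV variable names are placeholders
config.paperclip_defaults = {
  storage: :s3,
  s3_region: ENV['AWS_REGION'], # needed with Paperclip 5 / aws-sdk v2
  s3_credentials: {
    bucket: ENV['S3_BUCKET_NAME'],
    access_key_id: ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
  }
}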

Here’s my has_attached_file. In this example, I’m creating only two styles: a thumbnail, and an optimized version.

Notice that I’ve turned off the lossless-only safeguard; in other words, allow_lossy: true.

With this safeguard on (allow_lossy: false, which is the default), I’m usually only able to get an image down to about 75% of its original size.

A large 909 KB file was only reduced to 730 KB, whereas Optimizilla was able to get it down to a whopping 189 KB.

With the safeguard switched off (allow_lossy: true), I get much better compression, but much worse quality.

1st Example
Here, I define a thumb and an optimized style.

has_attached_file :attachment, {
  styles: {
    :thumb => '125x100>',
    :optimized => '%'
  },
  processors: [:thumbnail, :paperclip_optimizer],
  paperclip_optimizer: {
    nice: 19,
    jpegoptim: { strip: :all, max_quality: 10, allow_lossy: true },
    jpegrecompress: { quality: 1 },
    jpegtran: { progressive: true },
    optipng: { level: 2 },
    pngout: { strategy: 1 }
  },
  convert_options: { :all => '-auto-orient +profile "exif"' },
  s3_headers: { 'Cache-Control' => 'max-age=31536000' }
}
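Once those styles exist, your views just ask for them by name. A quick sketch, assuming a model instance with that attachment (like the Asset model in my demo app):

<%= image_tag @asset.attachment.url(:thumb) %>
<%= image_tag @asset.attachment.url(:optimized) %>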

2nd Example
Here, I define a thumb and a large style.

Remember, when configured together, the whole thing looks like this (see the “Per style setting” section of the paperclip-optimizer docs):

(this is an example that mimics the paperclip-optimizer docs)

has_attached_file :avatar,
  processors: [:thumbnail, :paperclip_optimizer],
  paperclip_optimizer: {
    # global settings (shared by all styles) would go here
  },
  styles: {
    thumb: { geometry: "100x100>" },
    large: {
      geometry: "%",
      paperclip_optimizer: {
        jpegrecompress: { allow_lossy: true, quality: 4 },
        jpegoptim: { allow_lossy: true, strip: :all, max_quality: 75 }
      }
    }
  }

The Magic Sauce

The docs say you should leave allow_lossy at its default, which is false. That way, your images come out with no quality loss. In my tests, though, I’ve found that this setting should be turned on, overriding the default.

I recommend paying attention to two important settings:
jpegoptim max_quality – 0 through 100, with 100 being the best quality
jpegrecompress quality – 0 through 4, with 4 being the best quality

In my tests, I’ve found that the following are acceptable for production websites with high-quality images.

Option A
jpegoptim max_quality: 80; jpegrecompress quality: 4
This yields compressed images that are 20-40% of the size of the uncompressed JPGs.

Option B
jpegoptim max_quality: 60; jpegrecompress quality: 3
This yields compressed images that are 10-20% of the size of the uncompressed JPGs.
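In paperclip_optimizer terms, those two options map onto the two settings roughly like this (a sketch, not copied verbatim from the demo app; the OPTION_A and OPTION_B constants are just for illustration, and each hash would go under the paperclip_optimizer: key of has_attached_file):

# Option A -- best jpegrecompress quality, lighter compression
OPTION_A = {
  jpegoptim:      { allow_lossy: true, strip: :all, max_quality: 80 },
  jpegrecompress: { allow_lossy: true, quality: 4 }
}

# Option B -- slightly lower quality, heavier compression
OPTION_B = {
  jpegoptim:      { allow_lossy: true, strip: :all, max_quality: 60 },
  jpegrecompress: { allow_lossy: true, quality: 3 }
}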

As far as I can tell, the jpegoptim max_quality setting appears to have very little effect on file size, whereas the jpegrecompress quality setting has the most dramatic effect, especially on larger files. The values for jpegrecompress quality are 0-4, with 0 being the lowest quality (most savings) and 4 being the best quality. With a setting of 4, you can’t perceive any quality loss, but you don’t get the benefit of extremely optimized files. I recommend a setting of 3, which is barely noticeable in terms of quality loss but gives a significant reduction in file size.

Test App

I threw together a test demo here. It lets you upload your own JPGs and see how they compress. It’s important to examine your own files, weighing the quality loss against the file-size savings (that is, the speed gain from having smaller files).

https://image-optim-paperclip-exmp-41.herokuapp.com/assets

You can read the source of this demo app on GitHub.

Please note this Heroku (production) app is configured with a few extra goodies:

AWS setup for a basic Amazon S3 bucket
Postgres setup for Heroku

This app is configured to use an Amazon S3 bucket called jasonfb-example1. Because I pay for this bucket, please do not abuse it. This demo app is provided for developer testing purposes only; I reserve the right to delete any images uploaded for any reason, including copyright infringement or simply lack of space. Please do not upload any inappropriate photos or photos you do not own.

You can hit the “Destroy” button on any image you upload.

The jpegoptim max_quality and the jpegrecompress quality settings

You’ll notice my example app creates 5 different versions, using the same jpegoptim setting (jpegoptim: { allow_lossy: true, strip: :all, max_quality: 75 }) but 5 different quality settings for jpegrecompress. (Be sure to note that jpegrecompress takes a quality parameter of 0-4, while jpegoptim takes a max_quality setting of 0-100.)

In my example app I’ve split the settings for jpegrecompress and jpegoptim into a global setting and a per-style setting. Its setup differs from the examples above.

In my sample app, I’ve set the jpegoptim max_quality setting to 75 and created five different jpegrecompress settings: 0, 1, 2, 3, and 4, named:

optimized_compress_0
optimized_compress_1
optimized_compress_2
optimized_compress_3
optimized_compress_4

(you’ll see these in the has_attached_file in app/models/asset.rb)
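The post doesn’t reproduce that file here, but reconstructed from the description above, the model looks roughly like this. This is a sketch, not a verbatim copy of the demo app; the thumb style, the '100%' geometry, and the content-type validation are my own assumptions:

# app/models/asset.rb -- a sketch reconstructed from the description above
class Asset < ActiveRecord::Base
  # five identical styles, differing only in their jpegrecompress quality level (0-4)
  compress_styles = (0..4).each_with_object({}) do |quality, styles|
    styles[:"optimized_compress_#{quality}"] = {
      geometry: '100%', # keep the original dimensions, just re-compress
      paperclip_optimizer: { jpegrecompress: { allow_lossy: true, quality: quality } }
    }
  end

  has_attached_file :attachment,
    processors: [:thumbnail, :paperclip_optimizer],
    # the global setting, shared by every style
    paperclip_optimizer: { jpegoptim: { allow_lossy: true, strip: :all, max_quality: 75 } },
    styles: { thumb: '125x100>' }.merge(compress_styles)

  validates_attachment_content_type :attachment, content_type: /\Aimage\/.*\z/
end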

So go ahead, upload a color-rich, un-optimized image. In my experiments, I found that quality settings 4, 3, 2, and 1 yield approximately the same file size, with only a small dip in file size when you go down to 0.

However, noticeable quality loss begins to appear even at quality setting 3, so it seems to me you might as well use quality setting 4. You’ll still be baking in an automatic guard against very large un-optimized images coming into your app. You’ll need to play around with these two settings for yourself.

Important Addendum (2017-03-09)

I am adding an important addendum to this post. After switching around my buildpacks on Heroku, I ran into a strange Sprockets error:

undefined method `dependency_digest' for #<Sprockets::StaticAsset:0x007fefb93d0d28>

The only way I found to fix this was to purge the build cache so the assets are recompiled from scratch. This means your first push after purging will take extra long to compile the slug.

If you run into that error, do this before you push to your environment:

heroku repo:purge_cache -a appname

Also see this Stack Overflow post. I corresponded with the maintainer of Sprockets regarding this issue, and he suggested that later versions of Sprockets may have addressed it (we are on Rails 4.1 with Sprockets 2.12.4).

Feb 15, 2017
 

In your Google Trusted Store set-up, there is a process where you need to use a special link to validate your GTS badge.

You’ll find this special link in the popup behind the blue “Test” button where you see your store listed. There, you’ll see a panel called “Browsers to test” and an instruction to “Copy and paste this URL into your browser window.”

Once you do this, you’ll see a beige bar appear on your site.

This lets you preview your GTS integration so that Google can certify your website for the program. Annoyingly, this bar doesn’t appear to go away by itself, nor can I find a way to disable it in GTS.

To remove it, you must remove the cookies in your browser associated with

googlecommerce.com

and

www.googlecommerce.com

In Chrome, go to Advanced > Content Settings > All Cookies and Data and search for the specific domains above.

Then delete those cookies completely from your browser.

Feb 12, 2017
 

My colleague Reid Cooper and I discovered a nice little trick with controller concerns, something we sometimes call “behaviors” in our app (typically implemented as modules). We found a trick from this link that lets us mix behavior into both a controller and a view helper, but first, a brief introduction to controller concerns.

Concerns were born in Rails 4 as a nod to the limitations, vis-à-vis the domain model, of a “strict” interpretation of MVC as implemented by Rails. Around 2012 or 2013, most experienced Rails developers would explain that the MVC structure created by default doesn’t necessarily dictate a strict MVC paradigm. Thanks in part to DCI architecture -- which complements but does not replace MVC -- a more modern understanding of larger apps includes a domain layer, i.e., a place to put the business domain logic that doesn’t belong in the traditional Rails models.

There are various options, and in a small nod to the problem, the Rails core team added an empty folder to default Rails installs. You might notice this folder at app/controllers/concerns. What, the Rails newbie asks, am I to do with an empty folder?

Good question. You would do well to study the excellent work of Sandi Metz and James Coplien, who cover domain abstraction (and a specific pattern the latter calls “DCI,” or data-context-interaction) in two excellent books (POODR and Lean Architecture, respectively). Their scope is well beyond this blog post, but since they are such heroes of mine, I want to take the opportunity to plug these excellent books.

Reid and I wanted a behavior, à la “concern,” that we could mix into a controller to give the controller instance methods. We also wanted a view helper automagically mixed into our views for the view to access while it renders. To the rescue: the obscure included hook, which gets called when a module is included into a controller and lets you access the controller class itself to add both helpers and action callbacks (formerly known as filters).

app/controllers/abc_controller.rb

class AbcController < ApplicationController
  include FancyConcern

  def index
  end
end

And here’s the magic, in app/controllers/concerns/fancy_concern.rb:

module FancyConcern
  def self.included(base)
    # http://www.railstips.org/blog/archives/2009/05/15/include-vs-extend-in-ruby/
    # make FancyConcernViewHelper's methods available to this controller's views
    base.helper FancyConcernViewHelper
    base.before_action :set_my_instance_variable
  end

  def set_my_instance_variable
    @my_instance_variable = "_instance variable value_"
  end
end
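The concern wires in FancyConcernViewHelper with base.helper. That module isn’t shown inline in this post (it presumably lives in the test app), but it’s just an ordinary helper module. A minimal sketch, with a placeholder method body:

# app/helpers/fancy_concern_view_helper.rb -- minimal sketch; the return value is a placeholder
module FancyConcernViewHelper
  def my_view_helper_method
    "a value from the mixed-in view helper"
  end
end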

app/views/abc/index.html.erb

Hello world!

<fieldset>
<legend>An instance variable set in a before_action</legend>
<%= @my_instance_variable %>
</fieldset>

<fieldset>
<legend>A call to the view</legend>
<%= my_view_helper_method %>
</fieldset>

Check out the full test app.

I don’t have a demo up and running, but it works (I took a screenshot, below). If you want, you can pull it locally and run it yourself to see.
